<?xml version="1.0" encoding="utf-8"?><!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd"><article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">INFORMATICA</journal-id>
<journal-title-group><journal-title>Informatica</journal-title></journal-title-group>
<issn pub-type="epub">1822-8844</issn><issn pub-type="ppub">0868-4952</issn><issn-l>0868-4952</issn-l>
<publisher>
<publisher-name>Vilnius University</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">INFO1205</article-id>
<article-id pub-id-type="doi">10.15388/Informatica.2018.191</article-id>
<article-categories><subj-group subj-group-type="heading">
<subject>Research Article</subject></subj-group></article-categories>
<title-group>
<article-title>Adaptive Eye Fundus Vessel Classification for Automatic Artery and Vein Diameter Ratio Evaluation</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name><surname>Stabingis</surname><given-names>Giedrius</given-names></name><email xlink:href="giedrius.stabingis@ku.lt">giedrius.stabingis@ku.lt</email><xref ref-type="aff" rid="j_info1205_aff_001">1</xref><xref ref-type="aff" rid="j_info1205_aff_002">2</xref><xref ref-type="corresp" rid="cor1">∗</xref><bio>
<p><bold>G. Stabingis</bold> received a BA in informatics in 2005 and an MS in statistics in 2011 from Klaipeda University, Lithuania. He is an assistant professor at the Informatics and Statistics Department of Klaipeda University. Currently he is a PhD student at Vilnius University, Institute of Data Science and Digital Technologies. His interests include image analysis methods, image processing, classification methods, spatial statistics and spatial data mining.</p></bio>
</contrib>
<contrib contrib-type="author">
<name><surname>Bernatavičienė</surname><given-names>Jolita</given-names></name><email xlink:href="jolita.bernataviciene@mii.vu.lt">jolita.bernataviciene@mii.vu.lt</email><xref ref-type="aff" rid="j_info1205_aff_001">1</xref><bio>
<p><bold>J. Bernatavičienė</bold> graduated from Vilnius Pedagogical University in 2004 and received a master’s degree in informatics. In 2008, she received the doctoral degree in computer science (PhD) from the Institute of Mathematics and Informatics jointly with Vilnius Gediminas Technical University. She is a researcher at the Cognitive Computing Group of Vilnius University, Institute of Data Science and Digital Technologies. Her research interests include databases, data mining, neural networks, image analysis, visualization, decision support systems and internet technologies.</p></bio>
</contrib>
<contrib contrib-type="author">
<name><surname>Dzemyda</surname><given-names>Gintautas</given-names></name><email xlink:href="gintautas.dzemyda@mii.vu.lt">gintautas.dzemyda@mii.vu.lt</email><xref ref-type="aff" rid="j_info1205_aff_001">1</xref><bio>
<p><bold>G. Dzemyda</bold> received the doctoral degree in technical sciences (PhD) in 1984 and the degree of Doctor Habilitatus in 1997 from Kaunas University of Technology, where he was conferred the title of professor in 1998. He is currently at Vilnius University, Institute of Data Science and Digital Technologies, as the director of the Institute, head of the Cognitive Computing Group and principal researcher. His research interests cover visualization of multidimensional data, optimization theory and applications, data mining in databases, multiple criteria decision support, neural networks, parallel optimization and image analysis. He is the author of more than 240 scientific publications, two monographs and five textbooks, editor-in-chief of the international journals <italic>Informatica</italic> and <italic>Baltic Journal of Modern Computing</italic>, and a member of the editorial boards of seven international journals.</p></bio>
</contrib>
<contrib contrib-type="author">
<name><surname>Paunksnis</surname><given-names>Alvydas</given-names></name><email xlink:href="alvydas@stratelus.com">alvydas@stratelus.com</email><xref ref-type="aff" rid="j_info1205_aff_003">3</xref><xref ref-type="aff" rid="j_info1205_aff_004">4</xref><bio>
<p><bold>A. Paunksnis</bold> graduated from Kaunas University of Medicine, Lithuania. He has been a medical doctor since June 1969, DrSci (habil.) since 1993 and professor of ophthalmology since 1997. He has been the director of the Department of Ophthalmology, Institute for Biomedical Research of Kaunas University of Medicine since July 1992. He has been a member of the Council of the European Ophthalmologist Society since 1995, a full member of the International Society of Experimental Eye Research since 1989, president of the Lithuanian Ophthalmologist Society from 1993 to 1997, chairman of the Council of Kaunas University of Medicine from 1997 to 2000, coordinator of the Telemedicine Project Group of Kaunas University of Medicine since June 1999, and head of the Telemedicine Center of Kaunas University of Medicine since 2002. His present research areas include non-invasive ultrasound examination in ophthalmology, ophthalmooncology, epidemiology and telemedicine.</p></bio>
</contrib>
<contrib contrib-type="author">
<name><surname>Stabingienė</surname><given-names>Lijana</given-names></name><email xlink:href="lijana.stabingiene@ku.lt">lijana.stabingiene@ku.lt</email><xref ref-type="aff" rid="j_info1205_aff_002">2</xref><bio>
<p><bold>L. Stabingienė</bold> graduated from Klaipeda University, Lithuania, in 2007 and received a master’s degree in system research. In 2012 she received the doctoral degree in computer science (PhD) from Vilnius University, Institute of Mathematics and Informatics. She is an associate professor at the Informatics and Statistics Department of Klaipeda University. Her present research interests include classification of spatially correlated data, geostatistics, image analysis and spatial data mining.</p></bio>
</contrib>
<contrib contrib-type="author">
<name><surname>Treigys</surname><given-names>Povilas</given-names></name><email xlink:href="povilas.treigys@mii.vu.lt">povilas.treigys@mii.vu.lt</email><xref ref-type="aff" rid="j_info1205_aff_001">1</xref><bio>
<p><bold>P. Treigys</bold> graduated from Vilnius Gediminas Technical University, Lithuania, in 2005. In 2010 he received the doctoral degree in computer science (PhD) from the Institute of Mathematics and Informatics jointly with Vilnius Gediminas Technical University. He is currently at Vilnius University, Institute of Data Science and Digital Technologies, as senior researcher, associate professor and head of the Image and Signal Analysis Group. He is vice-dean for Information Technologies at Vilnius University, Faculty of Mathematics and Informatics, and a member of the Lithuanian Society for Biomedical Engineering. His interests include image analysis, object detection and feature extraction in image processing, automated segmentation of image objects, optimization methods, artificial neural networks and software engineering.</p></bio>
</contrib>
<contrib contrib-type="author">
<name><surname>Vaičaitienė</surname><given-names>Ramutė</given-names></name><email xlink:href="ramute.vaicaitiene@lka.lt">ramute.vaicaitiene@lka.lt</email><xref ref-type="aff" rid="j_info1205_aff_005">5</xref><bio>
<p><bold>R. Vaičaitienė</bold> graduated from Kaunas University of Medicine, Lithuania, and has been a medical doctor since 2000. In 2010 she received a master’s degree in health psychology. Since 1993 she has worked at the Jonas Basanavičius Military Medical Service, and since 2013 she has been an associate professor at the General Jonas Žemaitis Military Academy of Lithuania. Her present research areas include psychological well-being, stress management, stress risk factors, military psychological resilience, peculiarities of suicide prevention in the military, health psychology, neuroimmunology and the correction of psychosomatic disorders using psychological methods.</p></bio>
</contrib>
<aff id="j_info1205_aff_001"><label>1</label>Institute of Data Science and Digital Technologies, <institution>Vilnius University</institution>, Akademijos st. 4, Vilnius, <country>Lithuania</country></aff>
<aff id="j_info1205_aff_002"><label>2</label>Department of Computer Science and Statistics, <institution>Klaipeda University</institution>, H. Manto st. 84, Klaipeda, <country>Lithuania</country></aff>
<aff id="j_info1205_aff_003"><label>3</label><institution>Kaunas State Hospital</institution>, Hipodromo st. 13, Kaunas, <country>Lithuania</country></aff>
<aff id="j_info1205_aff_004"><label>4</label><institution>Telemedicine Research Centre</institution>, Vytauto ave. 27, Kaunas, <country>Lithuania</country></aff>
<aff id="j_info1205_aff_005"><label>5</label><institution>General Jonas Žemaitis Military Academy of Lithuania</institution>, Šilo Str. 5A, Vilnius, <country>Lithuania</country></aff>
</contrib-group>
<author-notes>
<corresp id="cor1"><label>∗</label>Corresponding author.</corresp>
</author-notes>
<pub-date pub-type="ppub"><year>2018</year></pub-date><pub-date pub-type="epub"><day>1</day><month>1</month><year>2018</year></pub-date><volume>29</volume><issue>4</issue><fpage>757</fpage><lpage>771</lpage><history><date date-type="received"><month>6</month><year>2018</year></date><date date-type="accepted"><month>12</month><year>2018</year></date></history>
<permissions><copyright-statement>© 2018 Vilnius University</copyright-statement><copyright-year>2018</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p>Open access article under the <ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0/">CC BY</ext-link> license.</license-p></license></permissions>
<abstract>
<p>Eye fundus imaging is a useful, non-invasive tool for tracking disease progression, for early disease detection and in other cases. Often the diagnosis is made by an ophthalmologist, and automatic analysis systems are used only for support. Several features are commonly used for disease detection; one of them is the artery and vein ratio, measured from the widths of the main vessels. To calculate this ratio, arteries must be separated from veins automatically, so vessel classification is a vital step. Most analysis methods require high quality images for correct classification. This paper presents an adaptive algorithm for vessel measurement that does not need to be tuned for particular imaging equipment or a specific situation. The main novelty of the proposed method is the extraction of blood vessel features based on a vessel width measurement algorithm and on vessel spatial dependency. Vessel classification accuracy rates of 0.855 and 0.859 are obtained on publicly available eye fundus image databases used for comparison with other state-of-the-art vessel classification algorithms in order to evaluate the artery-vein ratio (<inline-formula id="j_info1205_ineq_001"><alternatives><mml:math>
<mml:mi mathvariant="italic">A</mml:mi>
<mml:mi mathvariant="italic">V</mml:mi>
<mml:mi mathvariant="italic">R</mml:mi></mml:math><tex-math><![CDATA[$AVR$]]></tex-math></alternatives></inline-formula>). The method is also evaluated on images that capture artery and vein size changes before and after physical load. An Optomed OY Smartscope M5 PRO digital mobile eye fundus camera was used for image acquisition.</p>
</abstract>
<kwd-group>
<label>Key words</label>
<kwd>automatic vessel classification</kwd>
<kwd>vessel measurement</kwd>
<kwd>artery-vein ratio</kwd>
<kwd>eye fundus images</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="j_info1205_s_001">
<label>1</label>
<title>Introduction</title>
<p>Eye fundus retinal vessels are the only blood vessels in the body that can be viewed with non-invasive imaging (Miri <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_010">2017</xref>; Fraz <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_005">2014</xref>). Retinal images are commonly used in both manual and automatic vessel structure analysis. This analysis supports disease detection, which is especially important in the early stages of a disease (Li and Wee, <xref ref-type="bibr" rid="j_info1205_ref_008">2014</xref>). The first clinical findings in early diabetic retinopathy are often abnormalities of the retinal blood vessels (Miri <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_010">2017</xref>). One of these abnormalities is reflected in the ratio between the artery and vein diameters (<italic>AVR</italic>). <italic>AVR</italic> is approximately equal to 0.7 for a healthy person (Sun <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_023">2009</xref>). The ratio increases for sick persons, as the arteries become larger. Blood vessel width is measured for the main vessels in the <italic>AVR</italic> measurement region – the circular region from 1.5 to 3 optic nerve disc (<italic>OD</italic>) radii from the disc centre (Knudtson <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_006">2003</xref>). The ratio is calculated separately for the vessels above and below the <italic>OD</italic> centre. There are also other methods for measuring this ratio, which include more vessels in the calculations, select other measurement places, etc. (see e.g. Sun <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_023">2009</xref>; Niemeijer <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>). Timely evaluation of disorders can lead not only to a better evaluation of the disease but also to the prevention of its progression.</p>
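The measurement zone just described can be sketched in a few lines (a minimal illustration, not the authors' implementation; `od_center`, `od_radius` and the width lists are assumed to come from earlier detection and measurement stages, and the ratio shown is a deliberately crude mean-width estimate):

```python
import numpy as np

def avr_zone_mask(shape, od_center, od_radius):
    """Boolean mask of the standard AVR measurement zone: the annulus
    from 1.5 to 3 optic disc (OD) radii around the OD centre."""
    rows, cols = np.mgrid[0:shape[0], 0:shape[1]]
    dist = np.hypot(rows - od_center[0], cols - od_center[1])
    return (dist >= 1.5 * od_radius) & (dist <= 3.0 * od_radius)

def simple_avr(artery_widths, vein_widths):
    """Crude AVR estimate: mean artery width over mean vein width."""
    return float(np.mean(artery_widths) / np.mean(vein_widths))
```

In practice the ratio is computed separately for the vessels above and below the OD centre, as the text notes.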
<p>Taking measurements and performing other eye fundus analysis steps are time consuming for an expert, and ophthalmologists with many patients cannot always spend much time on the analysis. Here, computer aided systems perform most of the analysis and act as advisory or tracking systems (see, e.g. Li and Wee, <xref ref-type="bibr" rid="j_info1205_ref_008">2014</xref>; Bankhead <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_001">2012</xref>). When building such systems, many specific situations must be considered. For example, when an expert takes measurements, the measurement place is selected individually for each image – the most important factor is the ability to measure precisely. When the computer measures automatically, the measurement places are fixed, and in many cases they are not the best ones. The automatic system does not know the quality of the image, or even whether the person is sick. In many cases (Muramatsu <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_013">2011</xref>; Bankhead <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_001">2012</xref>), computer aided systems are adapted to specific technical equipment or even to a specific image database, and when a new type of image is gathered, the system must be updated to meet the new requirements. Sometimes this even leads to information loss because of reduced image resolution (Ravishankar <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_017">2009</xref>).</p>
<p>Many eye fundus analysis algorithms are tested only on high quality image databases. However, in some specific circumstances it is hard to get high quality images, or hard to obtain another image when the first one is not good enough. Image quality is also affected by the illness itself. These situations require an automatic system able to deal with images of lower quality.</p>
<p>Vessel classification is required for <italic>AVR</italic> calculation: the main vessels must be detected and classified into arteries and veins. Then, to evaluate the artery and vein ratio at a specific place in an eye fundus image, the following steps can be applied: the <italic>OD</italic> is detected so that measurements can be made at a specific place, and then the places for the measurements are selected.</p>
<p>The <italic>OD</italic> detection is a very important stage in eye fundus analysis (Buteikienė <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_002">2012</xref>). As mentioned before, <italic>OD</italic> detection is used to select the measurement places for <italic>AVR</italic> calculation. It is also used in vessel tracing algorithms, in disease evaluations based on <italic>OD</italic> analysis, such as <italic>OD</italic> excavation, and in other cases. The main problems in precise <italic>OD</italic> detection are uneven lighting, disease induced changes in the eye fundus and the noise produced by vessels inside the <italic>OD</italic>. In this paper, a two stage <italic>OD</italic> detection algorithm is used. It is based on the algorithm described in Stabingis <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>), with several improvements described in the next section.</p>
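As a rough illustration of the coarse stage of such detection (a generic heuristic, not the two-stage algorithm of Stabingis et al. (2018) itself), the OD can often be localized as the brightest compact region of a locally averaged green channel; a refinement stage would then analyse the neighbourhood of this point:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def locate_od_coarse(green, window=25):
    """Coarse OD localization: the OD is usually the brightest compact
    region, so take the argmax of a locally averaged green channel.
    Local averaging suppresses isolated bright noise pixels."""
    smoothed = uniform_filter(green.astype(float), size=window)
    return np.unravel_index(int(np.argmax(smoothed)), smoothed.shape)
```

The window size here is a hypothetical parameter; an adaptive method would derive it from the image scale rather than fix it.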
<p>In the next stage, the vessel width measurement algorithm (Stabingis <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>) is applied. Profile analysis is a technique used in vessel measurement (see e.g. Li <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_007">2003</xref>; Bankhead <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_001">2012</xref>). The vessel width measurement used in the proposed algorithm is based on the vessel profile analysis described in Stabingis <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>). Profile information is often extracted with the Bresenham algorithm, which is fast but leaves the distances between points unevenly distributed. In the proposed algorithm, profile extraction uses a spatial distance function, which enables the same analysis algorithm to be used with images of different resolutions. The measurement algorithm can detect vessels without prior information about vessel existence at a specific point. The extracted vessel tree is used only to determine the initial measurement direction, and only when this tree information is available. The vessels in the <italic>AVR</italic> measurement zone are measured, and then vessel classification is applied.</p>
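The difference matters in practice: Bresenham walks pixel centres, so the sample spacing varies with direction, while distance-based sampling keeps a fixed step. A minimal sketch of resolution-independent profile sampling (assumed bilinear interpolation at a fixed spatial step; the authors' exact distance function is not reproduced here):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def extract_profile(image, center, angle_deg, half_len, step=0.5):
    """Sample an intensity profile through `center` along `angle_deg`,
    with samples exactly `step` pixels apart (bilinear interpolation),
    so the same analysis code works at any image resolution."""
    t = np.arange(-half_len, half_len + step, step)
    ang = np.deg2rad(angle_deg)
    rows = center[0] + t * np.sin(ang)
    cols = center[1] + t * np.cos(ang)
    return map_coordinates(image.astype(float), [rows, cols], order=1)
```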
<p>Classification remains a very significant problem in medical image analysis (see e.g. Miri <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_010">2017</xref>; Renukalatha and Suresh, <xref ref-type="bibr" rid="j_info1205_ref_018">2018</xref>; Morkunas <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_012">2018</xref>). Many different classification algorithms are used for vessel classification in eye fundus images (see e.g. Miri <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_010">2017</xref>; Dashtbozorg <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_003">2014</xref>; Sun <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_023">2009</xref>; Niemeijer <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>; Treigys <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_024">2008</xref>). If the classification is to be performed automatically and on images of different types, unsupervised classification methods are more suitable. Supervised classification methods are more accurate than unsupervised ones, but they must be adapted to each image type. Different medical image analysis methods face specific image quality issues (see e.g. Prasath, <xref ref-type="bibr" rid="j_info1205_ref_016">2017</xref>). Eye fundus vessel classification struggles with different or uneven lighting. The proposed algorithm uses the <italic>k</italic>-means clustering method with spatially recalculated features that decrease the influence of uneven lighting.</p>
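The unsupervised step can be sketched with a generic two-cluster Lloyd's iteration (a textbook form with deterministic extreme-point initialization chosen for reproducibility, not the authors' implementation; the spatially recalculated, lighting-corrected features are assumed to be computed beforehand):

```python
import numpy as np

def kmeans_two(features, iters=100):
    """Minimal two-cluster k-means (Lloyd's algorithm). `features` is an
    (n, d) array of lighting-corrected vessel features; returns a 0/1
    label per vessel (e.g. artery vs. vein) and the cluster centres."""
    X = np.asarray(features, dtype=float)
    # Deterministic start: the two most extreme points (by feature sum).
    centers = X[[X.sum(axis=1).argmin(), X.sum(axis=1).argmax()]]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centers = np.array([X[labels == k].mean(axis=0) for k in (0, 1)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```

Which cluster corresponds to arteries and which to veins still has to be decided afterwards, e.g. by relative brightness.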
<p>There are many different features commonly used in vessel classification. They are often extracted from <italic>RGB</italic>, <italic>HSI</italic> or grey level image information using statistical functions, texture descriptors and others (see e.g. Renukalatha and Suresh, <xref ref-type="bibr" rid="j_info1205_ref_018">2018</xref>; Muramatsu <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_013">2011</xref>; Stabingis <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_021">2016</xref>). Such features are also common in many other image analysis problems. In this paper, new features are proposed, based on the profile information gathered by the vessel width measurement algorithm. These features are extracted during the vessel measurement stage; they use information from the inner part of the vessel and average it along the vessel adaptively to the vessel width.</p>
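One plausible reading of such a width-adaptive profile feature is sketched below (a hypothetical illustration, not the authors' exact definition): per cross-section, only the central part of the profile, proportional to the measured width, is kept, and the values are then averaged along the segment.

```python
import numpy as np

def width_adaptive_feature(profiles, widths, inner_frac=0.5):
    """Average a vessel-interior intensity feature along a segment.
    `profiles` are cross-sectional intensity profiles centred on the
    vessel; for each one, only the central window proportional to the
    measured `width` is kept, so the feature adapts to vessel width."""
    per_point = []
    for profile, width in zip(profiles, widths):
        mid = len(profile) // 2
        half = max(1, int(round(inner_frac * width / 2)))
        per_point.append(np.mean(profile[mid - half: mid + half + 1]))
    return float(np.mean(per_point))
```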
<p>Classification accuracy is compared with that of other common classification algorithms for eye fundus blood vessels on publicly available eye fundus databases. The <italic>INSPIRE-AVR</italic> (Niemeijer <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>) and <italic>DRIVE</italic> (Staal <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_020">2004</xref>) databases are used for comparison. The <italic>DRIVE</italic> database is used only to demonstrate that the algorithm works with images of any resolution without adaptation; the accuracy results on this database are less important, because images of such a small resolution are no longer used. The proposed algorithm is compared with the algorithms proposed in Dashtbozorg <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_003">2014</xref>), Muramatsu <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_013">2011</xref>), Mirsharif <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_011">2013</xref>) and Niemeijer <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>). Only algorithms for vessel patch classification are selected; algorithms for whole vessel tree classification are not used for comparison, because at this stage our algorithm is built only for <italic>AVR</italic> evaluation.</p>
<p>The method described in Muramatsu <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_013">2011</xref>) was designed specifically for <italic>AVR</italic> evaluation. It consists of the following stages. First, a vessel segmentation method, involving top-hat transformation, double-ring filtering and thresholding techniques, is applied to the green colour channel of the retinal image. Vessel centreline pixels are extracted and, according to the vessel crossing and bifurcation points, divided into separate vessel parts. Small vessels (with a diameter smaller than 3 pixels) are removed from the calculations. Thresholding and ellipse fitting are applied for <italic>OD</italic> detection; the longer ellipse axis is used as the <italic>OD</italic> radius for the <italic>AVR</italic> measurement zone. The segmentation and <italic>OD</italic> detection parameters were selected for a specific image resolution. The centreline pixels from the <italic>AVR</italic> measurement zone are used in classification. For each centreline pixel, five features are extracted: the <italic>RGB</italic> values of the pixel and contrast values gathered from the red and green channels. The contrast features are calculated from the red and green channels as the difference between the average pixel values in a <inline-formula id="j_info1205_ineq_002"><alternatives><mml:math>
<mml:mn>5</mml:mn>
<mml:mo>×</mml:mo>
<mml:mn>5</mml:mn></mml:math><tex-math><![CDATA[$5\times 5$]]></tex-math></alternatives></inline-formula> pixel region around the pixel of interest inside the vessel, and in a <inline-formula id="j_info1205_ineq_003"><alternatives><mml:math>
<mml:mn>10</mml:mn>
<mml:mo>×</mml:mo>
<mml:mn>10</mml:mn></mml:math><tex-math><![CDATA[$10\times 10$]]></tex-math></alternatives></inline-formula> pixel region around the pixel of interest outside the vessel. An <italic>LDA</italic> classifier was used to classify each centreline pixel, and each vessel segment is classified as an artery or a vein by a majority vote over its classified centreline pixels. Then, the major vessels are selected for <italic>AVR</italic> evaluation. Vessel measurements are made on the segmented vessels, averaging along the vessel patches. This method was tested on the <italic>DRIVE</italic> image database, and the main results used for comparison are presented in Table <xref rid="j_info1205_tab_002">2</xref>.</p>
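The contrast feature described above can be sketched roughly as follows (hedged: the alignment of the even-sized 10 × 10 window is ambiguous, so a symmetric 11 × 11 window approximates it here, and `vessel_mask` is an assumed binary segmentation of the vessel):

```python
import numpy as np

def contrast_feature(channel, vessel_mask, r, c):
    """Contrast at centreline pixel (r, c): mean intensity of vessel
    pixels in a 5x5 window minus mean intensity of non-vessel pixels
    in a larger (here 11x11, approximating 10x10) window."""
    inner = channel[r - 2:r + 3, c - 2:c + 3]
    inner_mask = vessel_mask[r - 2:r + 3, c - 2:c + 3]
    outer = channel[r - 5:r + 6, c - 5:c + 6]
    outer_mask = vessel_mask[r - 5:r + 6, c - 5:c + 6]
    return float(inner[inner_mask].mean() - outer[~outer_mask].mean())
```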
<p>The method used in Mirsharif <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_011">2013</xref>) consists of the following steps: image enhancement, vessel segmentation, thinning, feature extraction, artery/vein classification and post-processing. Histogram matching, several histogram equalization techniques and multi-scale retinex were used for image enhancement. The method proposed in Soares <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_019">2006</xref>), based on Gabor wavelets, is used for vessel segmentation. Vessels thinner than three pixels were removed from the vessel network. The vessels are divided into smaller segments, and many features were extracted for each segment, including information from the <italic>RGB</italic>, <italic>HSL</italic> and <italic>LAB</italic> colour spaces and their statistics. From all these features, the 8 best discriminating ones were selected. An <italic>LDA</italic> classifier was used in this method. Then post-processing, using structural knowledge, is applied to reclassify the vessels unconnected to the tree. The method was used both for the classification of all vessels and for the classification of vessels in the <italic>AVR</italic> measurement zone; the <italic>OD</italic> was marked manually. The <italic>AVR</italic> measurement zone data from the <italic>DRIVE</italic> image database are used for comparison.</p>
<p>The method used in Niemeijer <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>) consists of the following steps: pre-processing, vessel segmentation and centreline extraction, <italic>OD</italic> detection, vessel width measurement, vessel classification and <italic>AVR</italic> evaluation. Pre-processing involves <italic>FOV</italic> extraction and Gaussian filtering. Vessel segmentation is performed with a classification method. This stage was constructed for the <italic>DRIVE</italic> database, and its use with larger images involves reducing the image size. Pixel classification produces a likelihood map, and thresholding produces the vessel structure. The method includes the “tobogganing” technique from Fairfield (<xref ref-type="bibr" rid="j_info1205_ref_004">1990</xref>). The structure is then thinned. A supervised position regression method was used to detect the centre of the <italic>OD</italic> (Niemeijer <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_014">2009</xref>), and the <italic>OD</italic> radius was considered constant for all images. Vessel measurements were made perpendicular to the detected local centreline angle. Artery/vein classification was done with the <italic>LDA</italic> method, using 27 different features extracted from different image channels with different statistics applied to vessel centreline pixels and to vessel patches. During this classification stage, a vessel likelihood was calculated; the final class labels were obtained only during the <italic>AVR</italic> calculation stage, when the vessels were classified into arteries and veins using prior artery/vein structural knowledge, and the classified vessel pairs were used for <italic>AVR</italic> measurements. This method was tested on the <italic>INSPIRE-AVR</italic> database, which was first introduced in the same paper (Niemeijer <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>).</p>
<p>Finally, the method of Dashtbozorg <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_003">2014</xref>), which includes graph analysis, is selected for the comparison. It uses the characteristics of the retinal vessel tree. The method described in Mendonca <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_009">2013</xref>) is used for vessel segmentation. The vessel centreline structure is extracted and then used for graph construction. The obtained graph is modified to solve common error issues, and graph analysis leads to an initial node classification. Then an <italic>LDA</italic> classifier is applied, with 19 features extracted from different image channels together with their statistics. Whole vessel tree classification was performed, and results for the vessels in the <italic>AVR</italic> measurement region only are also presented. This method was tested on the <italic>DRIVE</italic> and <italic>INSPIRE-AVR</italic> databases.</p>
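Since all four comparison methods rely on LDA, a minimal two-class linear discriminant (the Fisher direction with a midpoint threshold, a generic textbook form rather than any specific paper's variant) can be sketched as:

```python
import numpy as np

def lda_fit(X0, X1):
    """Fit a two-class linear discriminant: project onto
    w = Sw^(-1) (m1 - m0) and threshold midway between class means."""
    X0, X1 = np.asarray(X0, float), np.asarray(X1, float)
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix.
    Sw = (np.cov(X0.T, bias=True) * len(X0)
          + np.cov(X1.T, bias=True) * len(X1))
    w = np.linalg.solve(Sw, m1 - m0)
    threshold = w @ (m0 + m1) / 2
    return w, threshold

def lda_predict(X, w, threshold):
    """Label 1 for the X1 side of the decision boundary, else 0."""
    return (np.asarray(X, float) @ w > threshold).astype(int)
```

The published methods feed this kind of classifier with 5 to 27 colour and contrast features per centreline pixel or segment, as described above.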
<p>The proposed algorithm is also tested on the high resolution image database <italic>OPTO-AVR</italic> described in Section 3. Its images were acquired with an Optomed OY Smartscope M5 PRO digital mobile eye fundus camera from soldiers before and after a sports load. The database consists of 86 different images.</p>
<p>The structure of this article is as follows. The method implementation section describes the proposed method, which consists of scale parameter evaluation, <italic>OD</italic> detection, vessel measurement, feature extraction and classification. The comparison with other similar methods is presented in the results section. Conclusions and further development opportunities are described in the concluding section.</p>
</sec>
<sec id="j_info1205_s_002">
<label>2</label>
<title>Method Implementation</title>
<p>The main adaptive features of the proposed method, compared with less adaptive methods, are presented in Fig. <xref rid="j_info1205_fig_001">1</xref>. The proposed method is adaptive to images of different sizes, and no image downsampling is applied. No parameters need to be tuned manually for different image data sets, and no separate models with different parameters are created. Image noise and uneven lighting of the image were considered while creating the algorithm. All the main steps (OD detection, blood vessel measurement, artery/vein classification and AVR evaluation) are performed automatically.</p>
<fig id="j_info1205_fig_001">
<label>Fig. 1</label>
<caption>
<p>Main adaptive features of the proposed method, compared with steps from less adaptive methods.</p>
</caption>
<graphic xlink:href="info1205_g001.jpg"/>
</fig>
<p>The main steps of the automatic algorithm for the evaluation of the artery and vein ratio are presented in Fig. <xref rid="j_info1205_fig_002">2</xref>. The green colour channel <inline-formula id="j_info1205_ineq_004"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">G</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{G}}$]]></tex-math></alternatives></inline-formula> from eye fundus image is used for eye fundus mask <inline-formula id="j_info1205_ineq_005"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">M</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{M}}$]]></tex-math></alternatives></inline-formula> extraction. For mask extraction, common image analysis techniques are applied. After thresholding the higher intensity part of <inline-formula id="j_info1205_ineq_006"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">G</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{G}}$]]></tex-math></alternatives></inline-formula>, mathematical morphology closing and flood fill operations are used to form a circular mask of the eye fundus region in the image. <inline-formula id="j_info1205_ineq_007"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">M</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{M}}$]]></tex-math></alternatives></inline-formula> is then used to calculate the scale parameter <inline-formula id="j_info1205_ineq_008"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">p</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${p_{s}}$]]></tex-math></alternatives></inline-formula> according to Eq. (<xref rid="j_info1205_eq_001">1</xref>), which combines a logistic growth model with a linear model. 
<disp-formula id="j_info1205_eq_001">
<label>(1)</label><alternatives><mml:math display="block">
<mml:mtable displaystyle="true">
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">p</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo><mml:mstyle displaystyle="true">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:mn>1</mml:mn>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo movablelimits="false">exp</mml:mo>
<mml:mo maxsize="1.19em" minsize="1.19em" fence="true" mathvariant="normal">(</mml:mo>
<mml:mo>−</mml:mo><mml:mstyle displaystyle="false">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">W</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">M</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>3</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo maxsize="1.19em" minsize="1.19em" fence="true" mathvariant="normal">)</mml:mo>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo>+</mml:mo><mml:mstyle displaystyle="true">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">W</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">M</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>4</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo mathvariant="normal">,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable></mml:math><tex-math><![CDATA[\[ {p_{s}}=\frac{{\beta _{1}}}{1+{\beta _{2}}\exp \big(-\frac{{W_{M}}}{{\beta _{3}}}\big)}+\frac{{W_{M}}}{{\beta _{4}}},\]]]></tex-math></alternatives>
</disp-formula> 
where <inline-formula id="j_info1205_ineq_009"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\beta _{1}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_010"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\beta _{2}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_011"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>3</mml:mn>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\beta _{3}}$]]></tex-math></alternatives></inline-formula> are parameters of the logistic growth model part, <inline-formula id="j_info1205_ineq_012"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>4</mml:mn>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\beta _{4}}$]]></tex-math></alternatives></inline-formula> is the parameter of the linear model part (<inline-formula id="j_info1205_ineq_013"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>4</mml:mn></mml:math><tex-math><![CDATA[${\beta _{1}}=4$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_014"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>20</mml:mn></mml:math><tex-math><![CDATA[${\beta _{2}}=20$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_015"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>3</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>500</mml:mn></mml:math><tex-math><![CDATA[${\beta _{3}}=500$]]></tex-math></alternatives></inline-formula> and <inline-formula id="j_info1205_ineq_016"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>4</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>3000</mml:mn></mml:math><tex-math><![CDATA[${\beta _{4}}=3000$]]></tex-math></alternatives></inline-formula> in our case), <inline-formula id="j_info1205_ineq_017"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">W</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">M</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${W_{M}}$]]></tex-math></alternatives></inline-formula> is the width of fundus mask <inline-formula id="j_info1205_ineq_018"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">M</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{M}}$]]></tex-math></alternatives></inline-formula>. Parameter <inline-formula id="j_info1205_ineq_019"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">p</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${p_{s}}$]]></tex-math></alternatives></inline-formula>, evaluated by Eq. (<xref rid="j_info1205_eq_001">1</xref>), is used in almost all steps of the algorithm. This parameter makes the algorithm applicable to images of different sizes. Eq. (<xref rid="j_info1205_eq_001">1</xref>) is a modified version of the parameter from Stabingis <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>) and adds adaptation to different <italic>FOV</italic> values, since newer fundus cameras acquire images at higher resolution and use optics with a larger <italic>FOV</italic>. The obtained mask <inline-formula id="j_info1205_ineq_020"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">M</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{M}}$]]></tex-math></alternatives></inline-formula> is further used in other steps of the algorithm, such as the normalization operation.</p>
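As a minimal illustration, Eq. (1) can be coded directly. The function and argument names below are ours; the β₁–β₄ values from the text are used as defaults:

```python
import math

def scale_parameter(w_m, b1=4.0, b2=20.0, b3=500.0, b4=3000.0):
    """p_s of Eq. (1): a logistic growth term in the fundus-mask width
    W_M plus a linear term, with beta_1..beta_4 as given in the text."""
    return b1 / (1.0 + b2 * math.exp(-w_m / b3)) + w_m / b4
```

For example, a fundus mask about 3000 pixels wide yields p_s ≈ 4.81, so pixel-based sizes used in later steps grow with image resolution.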
<fig id="j_info1205_fig_002">
<label>Fig. 2</label>
<caption>
<p>Main steps of the proposed method for automatic <italic>AVR</italic> evaluation.</p>
</caption>
<graphic xlink:href="info1205_g002.jpg"/>
</fig>
<p>Image pre-processing unifies different images and reveals image features required for specific operations; it is treated as a vital stage in many eye fundus image analysis methods. The proposed method consists of several large stages (see Fig. <xref rid="j_info1205_fig_002">2</xref>), and for each stage a different image pre-processing algorithm is applied. Separate pre-processing allows each stage to be controlled independently, because some methods can enhance the extraction of the blood vessel tree but reduce <italic>OD</italic> extraction reliability. The vessel tree <inline-formula id="j_info1205_ineq_021"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">t</mml:mi>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{tr}}$]]></tex-math></alternatives></inline-formula> extraction stage is not critical in the proposed algorithm, because the <inline-formula id="j_info1205_ineq_022"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">t</mml:mi>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{tr}}$]]></tex-math></alternatives></inline-formula> information is used in other stages only as auxiliary data. In some other eye fundus analysis algorithms, <inline-formula id="j_info1205_ineq_023"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">t</mml:mi>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{tr}}$]]></tex-math></alternatives></inline-formula> extraction is very important, and the extracted tree is used as the reference for almost all other stages. Such dependence on the extracted <inline-formula id="j_info1205_ineq_024"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">t</mml:mi>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{tr}}$]]></tex-math></alternatives></inline-formula> can lead to errors, especially for images of sick patients, where artefacts can be mistaken for blood vessels or where the contrast between vessel and background is very low, producing breaks in the vessel tree. There are several different techniques for <inline-formula id="j_info1205_ineq_025"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">t</mml:mi>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{tr}}$]]></tex-math></alternatives></inline-formula> extraction, such as thresholding, mathematical morphology, Gabor filtering, and wavelets. More precise methods have also been developed; they achieve higher extraction accuracy but require considerably more computation. A mathematical morphology based method is applied here; it is described in more detail in Stabingis <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>). All other stages work even without the extracted <inline-formula id="j_info1205_ineq_026"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">t</mml:mi>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{tr}}$]]></tex-math></alternatives></inline-formula> and in places where <inline-formula id="j_info1205_ineq_027"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">t</mml:mi>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{tr}}$]]></tex-math></alternatives></inline-formula> is incorrectly extracted.</p>
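As an illustrative sketch only (the actual extraction follows Stabingis <italic>et al.</italic>, 2018), a mathematical-morphology vessel map can be obtained with a grey-level black-hat: vessels are darker than the background in the green channel, so a closing wide enough to fill them, minus the original, highlights them. The function name, element size, and threshold rule here are our assumptions:

```python
import numpy as np
from scipy import ndimage

def extract_vessel_tree(green, ps=1.0, k=0.5):
    """Rough vessel-tree mask via a morphological black-hat.

    A grey-level closing with a structuring element wider than a vessel
    fills the dark vessels in; the difference (black-hat) highlights
    them. `ps` scales the structuring element with image size."""
    size = max(3, int(round(11 * ps)))          # element wider than a vessel
    closed = ndimage.grey_closing(green, size=(size, size))
    blackhat = closed - green                   # bright where vessels were
    thr = blackhat.mean() + k * blackhat.std()  # simple global threshold
    return blackhat > thr
```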
<sec id="j_info1205_s_003">
<label>2.1</label>
<title>Optic Nerve Disc Extraction</title>
<p>The <italic>OD</italic> extraction algorithm consists of two stages. The initial <inline-formula id="j_info1205_ineq_028"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">init</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{\mathit{init}}}$]]></tex-math></alternatives></inline-formula> is located in the first stage, and then the real <inline-formula id="j_info1205_ineq_029"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{r}}$]]></tex-math></alternatives></inline-formula> circle is detected and evaluated. During the second stage, a circle is fitted in order to detect a circle-like object in the image. The full <italic>OD</italic> detection scheme is presented in Fig. <xref rid="j_info1205_fig_003">3</xref>. The main <italic>OD</italic> detection steps are described in more detail in Stabingis <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>).</p>
<fig id="j_info1205_fig_003">
<label>Fig. 3</label>
<caption>
<p>Two-stage OD detection scheme. Green dashed lines show the influence of the scale parameter, used for image size adaptation.</p>
</caption>
<graphic xlink:href="info1205_g003.jpg"/>
</fig>
<p>The first <inline-formula id="j_info1205_ineq_030"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">init</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{\mathit{init}}}$]]></tex-math></alternatives></inline-formula> detection stage finds the preliminary location of the <italic>OD</italic>. During this stage, the probability map <inline-formula id="j_info1205_ineq_031"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">pr</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{\mathit{pr}}}$]]></tex-math></alternatives></inline-formula> is created. This probability map combines the influence of five different components: the image information part <inline-formula id="j_info1205_ineq_032"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">pr</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{\mathit{pr}}}$]]></tex-math></alternatives></inline-formula>, the vessel tree line intersection map (Stabingis <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>) multiplied by 0.2, the vertical centre probability map multiplied by 0.2, the vertical gradient map multiplied by −0.4, and the horizontal gradient map multiplied by −0.3. <inline-formula id="j_info1205_ineq_033"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">pr</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{\mathit{pr}}}$]]></tex-math></alternatives></inline-formula> is used for finding the most intense circle <inline-formula id="j_info1205_ineq_034"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">init</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{\mathit{init}}}$]]></tex-math></alternatives></inline-formula> with a radius from <inline-formula id="j_info1205_ineq_035"><alternatives><mml:math>
<mml:mn>80</mml:mn>
<mml:mo>·</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">p</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[$80\cdot {p_{s}}$]]></tex-math></alternatives></inline-formula> to <inline-formula id="j_info1205_ineq_036"><alternatives><mml:math>
<mml:mn>200</mml:mn>
<mml:mo>·</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">p</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[$200\cdot {p_{s}}$]]></tex-math></alternatives></inline-formula> intersecting with <inline-formula id="j_info1205_ineq_037"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">t</mml:mi>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{tr}}$]]></tex-math></alternatives></inline-formula>.</p>
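The combination step described above can be sketched as follows. The 0.2/0.2/−0.4/−0.3 weights are taken from the text; giving the image information part a unit weight is our assumption, since the text states no multiplier for it, and the function name is ours:

```python
import numpy as np

def od_probability_map(i_part, tree_intersections, cvert, vgrad, hgrad):
    """Combine the five maps into the OD probability map I_pr:
    + image information part (weight 1, assumed)
    + 0.2 * vessel tree line intersection map
    + 0.2 * vertical centre probability map
    - 0.4 * vertical gradient map
    - 0.3 * horizontal gradient map"""
    return (i_part + 0.2 * tree_intersections + 0.2 * cvert
            - 0.4 * vgrad - 0.3 * hgrad)
```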
<p>Normally, the eye fundus image is photographed in such a way that the <italic>OD</italic> is located near the vertical centre of the image. To incorporate this property, the vertical centre probability map is calculated according to Eq. (<xref rid="j_info1205_eq_002">2</xref>). 
<disp-formula id="j_info1205_eq_002">
<label>(2)</label><alternatives><mml:math display="block">
<mml:mtable displaystyle="true">
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">cvert</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal" fence="true" stretchy="false">(</mml:mo>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mo mathvariant="normal">,</mml:mo>
<mml:mi mathvariant="italic">j</mml:mi>
<mml:mo mathvariant="normal" fence="true" stretchy="false">)</mml:mo>
<mml:mo>=</mml:mo><mml:mstyle displaystyle="true">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">v</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mi mathvariant="italic">h</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:msqrt>
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mi mathvariant="italic">π</mml:mi>
<mml:msup>
<mml:mrow>
<mml:mi mathvariant="italic">σ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo movablelimits="false">exp</mml:mo>
<mml:mo maxsize="2.03em" minsize="2.03em" fence="true" mathvariant="normal">(</mml:mo>
<mml:mo>−</mml:mo><mml:mstyle displaystyle="true">
<mml:mfrac>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mo mathvariant="normal" fence="true" stretchy="false">(</mml:mo>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mo>−</mml:mo><mml:mstyle displaystyle="false">
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="italic">h</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo mathvariant="normal" fence="true" stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mo>·</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mi mathvariant="italic">σ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo maxsize="2.03em" minsize="2.03em" fence="true" mathvariant="normal">)</mml:mo>
<mml:mo mathvariant="normal">,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable></mml:math><tex-math><![CDATA[\[ {I_{\mathit{cvert}}}(i,j)=\frac{{\beta _{v}}h}{\sqrt{2\pi {\sigma ^{2}}}}\exp \bigg(-\frac{(i-\frac{h}{2})^{2}}{2\cdot {\sigma ^{2}}}\bigg),\]]]></tex-math></alternatives>
</disp-formula> 
where <italic>h</italic> is the height of image <inline-formula id="j_info1205_ineq_038"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{I}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_039"><alternatives><mml:math>
<mml:msup>
<mml:mrow>
<mml:mi mathvariant="italic">σ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mo mathvariant="normal" fence="true" stretchy="false">(</mml:mo><mml:mstyle displaystyle="false">
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="italic">h</mml:mi>
<mml:mo mathvariant="normal" stretchy="false">/</mml:mo>
<mml:mn>2</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:mn>3</mml:mn>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo mathvariant="normal" fence="true" stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msup></mml:math><tex-math><![CDATA[${\sigma ^{2}}={(\frac{h/2}{3})^{2}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_040"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">v</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\beta _{v}}$]]></tex-math></alternatives></inline-formula> is the intensity control parameter for the vertical probability map (<inline-formula id="j_info1205_ineq_041"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">β</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">v</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>100</mml:mn></mml:math><tex-math><![CDATA[${\beta _{v}}=100$]]></tex-math></alternatives></inline-formula> in our case), <italic>i</italic> is the image row number and <italic>j</italic> is the image column number. To eliminate uneven illumination, vertical and horizontal gradient maps are calculated: the vertical gradient map is obtained by averaging every image row, and the horizontal gradient map by averaging every image column.</p>
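A sketch of the vertical-centre map of Eq. (2) and the two gradient maps (function names are ours; the Gaussian is written with the usual squared deviation, and σ² = ((h/2)/3)² as defined in the text):

```python
import numpy as np

def vertical_centre_map(h, w, beta_v=100.0):
    """Vertical-centre probability map of Eq. (2): a Gaussian over the
    row index i, centred at h/2 with sigma = (h/2)/3, identical for
    every column j."""
    sigma2 = ((h / 2.0) / 3.0) ** 2
    i = np.arange(h, dtype=float)
    col = (beta_v * h / np.sqrt(2.0 * np.pi * sigma2)
           * np.exp(-((i - h / 2.0) ** 2) / (2.0 * sigma2)))
    return np.tile(col[:, None], (1, w))

def gradient_maps(green):
    """Row-mean and column-mean maps used to compensate uneven
    illumination (vertical and horizontal gradient maps)."""
    vert = np.tile(green.mean(axis=1)[:, None], (1, green.shape[1]))
    horiz = np.tile(green.mean(axis=0)[None, :], (green.shape[0], 1))
    return vert, horiz
```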
<p>The region of interest is selected according to the <inline-formula id="j_info1205_ineq_042"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">init</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{\mathit{init}}}$]]></tex-math></alternatives></inline-formula> circle enlarged 2.5 times, and it is used for the real <inline-formula id="j_info1205_ineq_043"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{r}}$]]></tex-math></alternatives></inline-formula> detection (Stabingis <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>). After several image analysis and processing operations (see Fig. <xref rid="j_info1205_fig_003">3</xref>), Hough circle detection is applied, and the ten best circles are selected. The circle surrounded by the most points from <inline-formula id="j_info1205_ineq_044"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">b</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{b}}$]]></tex-math></alternatives></inline-formula> is selected as <inline-formula id="j_info1205_ineq_045"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{r}}$]]></tex-math></alternatives></inline-formula> and is used in further analysis.</p>
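The selection among the Hough candidates can be sketched as follows; representing candidates as (x, y, r) triples and the `band` tolerance around the circumference are our assumptions:

```python
import numpy as np

def select_od_circle(candidates, i_b, band=3.0):
    """From Hough candidate circles (x, y, r), keep the one whose
    circumference is surrounded by the most points of the binary map
    I_b; `band` is the pixel tolerance around the circle line."""
    ys, xs = np.nonzero(i_b)
    best, best_count = None, -1
    for cx, cy, r in candidates:
        dist = np.hypot(xs - cx, ys - cy)       # point-to-centre distances
        count = int(np.sum(np.abs(dist - r) <= band))
        if count > best_count:
            best, best_count = (cx, cy, r), count
    return best
```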
</sec>
<sec id="j_info1205_s_004">
<label>2.2</label>
<title>Blood Vessel Measurements</title>
<p>After <inline-formula id="j_info1205_ineq_046"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{r}}$]]></tex-math></alternatives></inline-formula> is detected, this information is used for vessel measurements. Measurements are taken along the circle of radius <inline-formula id="j_info1205_ineq_047"><alternatives><mml:math>
<mml:mn>2</mml:mn>
<mml:mo>·</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[$2\cdot {r_{\mathit{OD}}}$]]></tex-math></alternatives></inline-formula> from <inline-formula id="j_info1205_ineq_048"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{r}}$]]></tex-math></alternatives></inline-formula> centre (<inline-formula id="j_info1205_ineq_049"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">x</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${x_{\mathit{OD}}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_050"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">y</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${y_{\mathit{OD}}}$]]></tex-math></alternatives></inline-formula>), where <inline-formula id="j_info1205_ineq_051"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${r_{\mathit{OD}}}$]]></tex-math></alternatives></inline-formula> is the <inline-formula id="j_info1205_ineq_052"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{r}}$]]></tex-math></alternatives></inline-formula> radius. Measurements are taken at five-pixel intervals along this circle, at points calculated according to Eq. (<xref rid="j_info1205_eq_003">3</xref>). 
<disp-formula id="j_info1205_eq_003">
<label>(3)</label><alternatives><mml:math display="block">
<mml:mtable displaystyle="true">
<mml:mtr>
<mml:mtd>
<mml:mfenced separators="" open="{" close="">
<mml:mrow>
<mml:mtable equalrows="false" equalcolumns="false" columnalign="left">
<mml:mtr>
<mml:mtd class="array">
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">x</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">x</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mo maxsize="1.19em" minsize="1.19em" fence="true" mathvariant="normal">(</mml:mo>
<mml:mn>2</mml:mn>
<mml:mo>·</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>·</mml:mo>
<mml:mo movablelimits="false">cos</mml:mo>
<mml:mo mathvariant="normal" fence="true" stretchy="false">(</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">α</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal" fence="true" stretchy="false">)</mml:mo>
<mml:mo maxsize="1.19em" minsize="1.19em" fence="true" mathvariant="normal">)</mml:mo>
<mml:mo mathvariant="normal">,</mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd class="array">
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">y</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">y</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mo maxsize="1.19em" minsize="1.19em" fence="true" mathvariant="normal">(</mml:mo>
<mml:mn>2</mml:mn>
<mml:mo>·</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>·</mml:mo>
<mml:mo movablelimits="false">sin</mml:mo>
<mml:mo mathvariant="normal" fence="true" stretchy="false">(</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">α</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal" fence="true" stretchy="false">)</mml:mo>
<mml:mo maxsize="1.19em" minsize="1.19em" fence="true" mathvariant="normal">)</mml:mo>
<mml:mo mathvariant="normal">,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mfenced>
</mml:mtd>
</mml:mtr>
</mml:mtable></mml:math><tex-math><![CDATA[\[ \left\{\begin{array}{l}{x_{i}}={x_{\mathit{OD}}}+\big(2\cdot {r_{\mathit{OD}}}\cdot \cos ({\alpha _{i}})\big),\\ {} {y_{i}}={y_{\mathit{OD}}}+\big(2\cdot {r_{\mathit{OD}}}\cdot \sin ({\alpha _{i}})\big),\end{array}\right.\]]]></tex-math></alternatives>
</disp-formula> 
where <inline-formula id="j_info1205_ineq_053"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">α</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">α</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mo>−</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo><mml:mstyle displaystyle="false">
<mml:mfrac>
<mml:mrow>
<mml:mn>5</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mo>·</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mstyle></mml:math><tex-math><![CDATA[${\alpha _{i}}={\alpha _{i-1}}+\frac{5}{2\cdot {r_{\mathit{OD}}}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_054"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">α</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn></mml:math><tex-math><![CDATA[${\alpha _{0}}=0$]]></tex-math></alternatives></inline-formula> and <inline-formula id="j_info1205_ineq_055"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">α</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal">&lt;</mml:mo>
<mml:mn>2</mml:mn>
<mml:mi mathvariant="italic">π</mml:mi></mml:math><tex-math><![CDATA[${\alpha _{i}}<2\pi $]]></tex-math></alternatives></inline-formula>. The algorithm proposed in Stabingis <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>) is used for vessel width measurements. It automatically detects places where a vessel may be located. After a vessel is detected and measured, the algorithm follows the detected vessel and measures further profile widths. Vessel data is collected for every vessel in the range of radii from <inline-formula id="j_info1205_ineq_056"><alternatives><mml:math>
<mml:mn>1.5</mml:mn>
<mml:mo>·</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[$1.5\cdot {r_{\mathit{OD}}}$]]></tex-math></alternatives></inline-formula> to <inline-formula id="j_info1205_ineq_057"><alternatives><mml:math>
<mml:mn>3</mml:mn>
<mml:mo>·</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[$3\cdot {r_{\mathit{OD}}}$]]></tex-math></alternatives></inline-formula> (see Fig. <xref rid="j_info1205_fig_005">5</xref>). The average vessel width is used for further calculations. Measurements are omitted at places where the vessel has already been measured. Vessel widths are measured by analysing vessel profile data. Profile information is gathered from the image using the spatial-distance-based function in Eq. (<xref rid="j_info1205_eq_004">4</xref>). For every profile, 100 intensity values are calculated from the <italic>J</italic> nearest pixels. 
<disp-formula id="j_info1205_eq_004">
<label>(4)</label><alternatives><mml:math display="block">
<mml:mtable displaystyle="true">
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">v</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo><mml:mstyle displaystyle="true">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mo largeop="false" movablelimits="false">∑</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">∈</mml:mo>
<mml:mi mathvariant="italic">J</mml:mi>
</mml:mrow>
</mml:msub>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">z</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>·</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">h</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mo largeop="false" movablelimits="false">∑</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">∈</mml:mo>
<mml:mi mathvariant="italic">J</mml:mi>
</mml:mrow>
</mml:msub>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">h</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo mathvariant="normal">,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable></mml:math><tex-math><![CDATA[\[ {v_{j}}=\frac{{\textstyle\sum _{{s_{j}}\in J}}{z_{i}}\cdot {h_{ij}}}{{\textstyle\sum _{{s_{j}}\in J}}{h_{ij}}},\]]]></tex-math></alternatives>
</disp-formula> 
where <inline-formula id="j_info1205_ineq_058"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">v</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${v_{j}}$]]></tex-math></alternatives></inline-formula> is the intensity value for step <italic>j</italic> at location <inline-formula id="j_info1205_ineq_059"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${s_{j}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_060"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">h</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>−</mml:mo><mml:mstyle displaystyle="false">
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
<mml:mo mathvariant="normal" fence="true" stretchy="false">(</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal">,</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal" fence="true" stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mn>3</mml:mn>
</mml:mrow>
</mml:mfrac>
</mml:mstyle></mml:math><tex-math><![CDATA[${h_{ij}}=1-\frac{d({s_{i}},{s_{j}})}{3}$]]></tex-math></alternatives></inline-formula> is the distance function, for <inline-formula id="j_info1205_ineq_061"><alternatives><mml:math>
<mml:mi mathvariant="italic">d</mml:mi>
<mml:mo mathvariant="normal" fence="true" stretchy="false">(</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal">,</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal" fence="true" stretchy="false">)</mml:mo>
<mml:mo mathvariant="normal">&lt;</mml:mo>
<mml:mn>3</mml:mn></mml:math><tex-math><![CDATA[$d({s_{i}},{s_{j}})<3$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_062"><alternatives><mml:math>
<mml:mi mathvariant="italic">d</mml:mi>
<mml:mo mathvariant="normal" fence="true" stretchy="false">(</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal">,</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal" fence="true" stretchy="false">)</mml:mo></mml:math><tex-math><![CDATA[$d({s_{i}},{s_{j}})$]]></tex-math></alternatives></inline-formula> is the Euclidean distance between <inline-formula id="j_info1205_ineq_063"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${s_{i}}$]]></tex-math></alternatives></inline-formula> and <inline-formula id="j_info1205_ineq_064"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">j</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${s_{j}}$]]></tex-math></alternatives></inline-formula> points. The profile is smoothed with a Gaussian filter and analysed for width evaluation. The profile analysis points are shown in Fig. <xref rid="j_info1205_fig_004">4</xref>. A profile minimum point is found near the profile centre. Then, another nearby minimum point is found in order to eliminate a possible central reflex. Two points <inline-formula id="j_info1205_ineq_065"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">a</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${a_{l}}$]]></tex-math></alternatives></inline-formula> and <inline-formula id="j_info1205_ineq_066"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">a</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${a_{r}}$]]></tex-math></alternatives></inline-formula> are used as left and right profile parts. Local maximum points <inline-formula id="j_info1205_ineq_067"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">b</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${b_{l}}$]]></tex-math></alternatives></inline-formula> and <inline-formula id="j_info1205_ineq_068"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">b</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${b_{r}}$]]></tex-math></alternatives></inline-formula> are found. Then fastest decaying points <inline-formula id="j_info1205_ineq_069"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">c</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${c_{l}}$]]></tex-math></alternatives></inline-formula> and <inline-formula id="j_info1205_ineq_070"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">c</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${c_{r}}$]]></tex-math></alternatives></inline-formula> are detected. Similarly to <inline-formula id="j_info1205_ineq_071"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">c</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${c_{l}}$]]></tex-math></alternatives></inline-formula> and <inline-formula id="j_info1205_ineq_072"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">c</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${c_{r}}$]]></tex-math></alternatives></inline-formula>, decaying marginal points <italic>d</italic> are selected for left and right sides. Middle points <inline-formula id="j_info1205_ineq_073"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">e</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${e_{l}}$]]></tex-math></alternatives></inline-formula> (between <inline-formula id="j_info1205_ineq_074"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
<mml:mi mathvariant="italic">b</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${d_{lb}}$]]></tex-math></alternatives></inline-formula> and <inline-formula id="j_info1205_ineq_075"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
<mml:mi mathvariant="italic">t</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${d_{lt}}$]]></tex-math></alternatives></inline-formula>) and <inline-formula id="j_info1205_ineq_076"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">e</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${e_{r}}$]]></tex-math></alternatives></inline-formula> (between <inline-formula id="j_info1205_ineq_077"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
<mml:mi mathvariant="italic">b</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${d_{rb}}$]]></tex-math></alternatives></inline-formula> and <inline-formula id="j_info1205_ineq_078"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
<mml:mi mathvariant="italic">t</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${d_{rt}}$]]></tex-math></alternatives></inline-formula>) are selected as vessel width measurement points.</p>
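The distance-weighted profile sampling of Eq. (4) can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function name `sample_profile`, its argument names, and the (row, column) point convention are assumptions.

```python
import numpy as np

def sample_profile(image, start, end, n_samples=100, radius=3.0):
    """Sample n_samples intensity values along the segment start -> end
    ((row, col) pairs).  Each value v_j is a distance-weighted average of
    nearby pixel intensities with weights h_ij = 1 - d(s_i, s_j)/3 for
    d < 3, as in Eq. (4)."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    h, w = image.shape
    r = int(np.ceil(radius))
    profile = np.empty(n_samples)
    for j, t in enumerate(np.linspace(0.0, 1.0, n_samples)):
        sy, sx = start + t * (end - start)           # sample point s_j
        y0, y1 = max(0, int(sy) - r), min(h, int(sy) + r + 1)
        x0, x1 = max(0, int(sx) - r), min(w, int(sx) + r + 1)
        yy, xx = np.mgrid[y0:y1, x0:x1]              # nearby pixels s_i
        d = np.hypot(yy - sy, xx - sx)
        hij = np.clip(1.0 - d / radius, 0.0, None)   # zero weight outside radius
        profile[j] = np.sum(image[y0:y1, x0:x1] * hij) / np.sum(hij)
    return profile
```

Because the weights fall to zero at distance 3, each of the 100 samples is a subpixel interpolation of the intensities around it, which is what makes profiles extracted at arbitrary angles smooth.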
<fig id="j_info1205_fig_004">
<label>Fig. 4</label>
<caption>
<p>Blood vessel profile analysis.</p>
</caption>
<graphic xlink:href="info1205_g004.jpg"/>
</fig>
<p>Such profile analysis is performed on profiles extracted at point (<inline-formula id="j_info1205_ineq_079"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">x</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${x_{i}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_080"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">y</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${y_{i}}$]]></tex-math></alternatives></inline-formula>) at different angles, and the angle with the smallest profile width is selected as the profile perpendicular to the vessel. For the first measurement of a vessel, angles from <inline-formula id="j_info1205_ineq_081"><alternatives><mml:math>
<mml:msup>
<mml:mrow>
<mml:mn>0</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:mo>∘</mml:mo>
</mml:mrow>
</mml:msup></mml:math><tex-math><![CDATA[${0^{\circ }}$]]></tex-math></alternatives></inline-formula> to <inline-formula id="j_info1205_ineq_082"><alternatives><mml:math>
<mml:msup>
<mml:mrow>
<mml:mn>180</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:mo>∘</mml:mo>
</mml:mrow>
</mml:msup></mml:math><tex-math><![CDATA[${180^{\circ }}$]]></tex-math></alternatives></inline-formula> are analysed, and the detected angle is reused for subsequent measurements of the same vessel, reducing the number of angles that must be analysed. The vessel measurement algorithm measures vessels similarly to expert measurements and is presented in more detail in Stabingis <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_022">2018</xref>).</p>
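The angle search can be sketched as follows. The width estimate here is a deliberate simplification of the a/b/c/d/e point analysis described above, and both function names are ours, not the authors':

```python
import numpy as np

def profile_width(profile):
    """Toy width estimate: count of samples below the profile's midline.
    Stands in for the full a/b/c/d/e profile-point analysis."""
    mid = (profile.min() + profile.max()) / 2.0
    return int(np.sum(profile < mid))

def find_perpendicular(profile_at_angle, angles_deg):
    """Extract a profile at every candidate angle and keep the angle whose
    profile is narrowest; that profile is taken as perpendicular to the
    vessel.  profile_at_angle(angle) -> 1-D intensity profile."""
    widths = {a: profile_width(profile_at_angle(a)) for a in angles_deg}
    return min(widths, key=widths.get)
```

For a first measurement one would scan, e.g., `range(0, 181, 5)`; for subsequent measurements of the same vessel, only a narrow window around the previously detected angle needs to be scanned.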
</sec>
<sec id="j_info1205_s_005">
<label>2.3</label>
<title>Feature Extraction and Classification</title>
<p>Eye fundus vessel classification is a complicated task. For the proposed algorithm, novel features are extracted for classification. After analysing many vessels from images of different types, it was concluded that the part best suited for discriminating veins from arteries is the inner part of the vessel. Veins are usually darker than arteries, and arteries often have a brighter central reflex line. The outer parts of vessels of different types can be very similar. Three features are extracted from each vessel for classification. These features are an average value between the <inline-formula id="j_info1205_ineq_083"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">a</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${a_{l}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_084"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">a</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${a_{r}}$]]></tex-math></alternatives></inline-formula> points, an average value between <inline-formula id="j_info1205_ineq_085"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
<mml:mi mathvariant="italic">b</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${d_{lb}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_086"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
<mml:mi mathvariant="italic">b</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${d_{rb}}$]]></tex-math></alternatives></inline-formula> points and an average value between <inline-formula id="j_info1205_ineq_087"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
<mml:mi mathvariant="italic">b</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${d_{lb}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_088"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">a</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">l</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${a_{l}}$]]></tex-math></alternatives></inline-formula>, and <inline-formula id="j_info1205_ineq_089"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">a</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${a_{r}}$]]></tex-math></alternatives></inline-formula>, <inline-formula id="j_info1205_ineq_090"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
<mml:mi mathvariant="italic">b</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${d_{rb}}$]]></tex-math></alternatives></inline-formula> points (see Fig. <xref rid="j_info1205_fig_004">4</xref>). Features are calculated from the green image channel <inline-formula id="j_info1205_ineq_091"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">G</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{G}}$]]></tex-math></alternatives></inline-formula> and from red channel <inline-formula id="j_info1205_ineq_092"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">I</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">R</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${I_{R}}$]]></tex-math></alternatives></inline-formula> separately. These two channels are used because they carry the most discriminative information. Each feature value is averaged along the vessel segment, so for every detected vessel six averaged features are calculated.</p>
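The per-profile feature extraction can be sketched as below. The indices follow the point names of Fig. 4, but their detection, and the exact averaging regions, are assumptions of this illustration:

```python
import numpy as np

def vessel_features(profile, al, ar, dlb, drb):
    """Three illustrative intensity features from one perpendicular
    profile; indices follow Fig. 4 (dlb < al < ar < drb) and are assumed
    to be detected elsewhere:
      f1 - mean of the inner part [al, ar]
      f2 - mean of the full vessel span [dlb, drb]
      f3 - mean of the two flank regions [dlb, al] and [ar, drb]"""
    f1 = profile[al:ar + 1].mean()
    f2 = profile[dlb:drb + 1].mean()
    f3 = np.concatenate([profile[dlb:al + 1], profile[ar:drb + 1]]).mean()
    return f1, f2, f3
```

Computing these three values on both the green and the red channel profiles, and averaging each along the vessel segment, yields the six features per vessel described above.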
<p>As mentioned in the introduction, uneven image illumination can lead to misclassification of vessels. In order to eliminate this influence, the spatial-distance-based normalization function of Eq. (<xref rid="j_info1205_eq_005">5</xref>) is applied to the extracted features. <disp-formula-group id="j_info1205_dg_001">
<disp-formula id="j_info1205_eq_005">
<label>(5)</label><alternatives><mml:math display="block">
<mml:mtable displaystyle="true" columnalign="left">
<mml:mtr>
<mml:mtd class="align-odd">
<mml:msubsup>
<mml:mrow>
<mml:mi mathvariant="italic">z</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:msup>
<mml:mrow/>
<mml:mrow>
<mml:mo>′</mml:mo>
</mml:mrow>
</mml:msup>
</mml:mrow>
</mml:msubsup>
<mml:mo>=</mml:mo><mml:mstyle displaystyle="true">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">z</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>−</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi mathvariant="italic">μ</mml:mi>
</mml:mrow>
<mml:mo stretchy="false">ˆ</mml:mo></mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi mathvariant="italic">σ</mml:mi>
</mml:mrow>
<mml:mo stretchy="false">ˆ</mml:mo></mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo mathvariant="normal">,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable></mml:math><tex-math><![CDATA[\[ {z_{i}^{{^{\prime }}}}=\frac{{z_{i}}-{\hat{\mu }_{d}}}{{\hat{\sigma }_{d}}},\]]]></tex-math></alternatives>
</disp-formula>
<disp-formula id="j_info1205_eq_006">
<label>(6)</label><alternatives><mml:math display="block">
<mml:mtable displaystyle="true" columnalign="left">
<mml:mtr>
<mml:mtd class="align-odd">
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi mathvariant="italic">μ</mml:mi>
</mml:mrow>
<mml:mo stretchy="false">ˆ</mml:mo></mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo><mml:mstyle displaystyle="true">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mo largeop="false" movablelimits="false">∑</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mo stretchy="false">∈</mml:mo>
<mml:mi mathvariant="italic">ω</mml:mi>
</mml:mrow>
</mml:msub><mml:mstyle displaystyle="false">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">z</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mo largeop="false" movablelimits="false">∑</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mo stretchy="false">∈</mml:mo>
<mml:mi mathvariant="italic">ω</mml:mi>
</mml:mrow>
</mml:msub><mml:mstyle displaystyle="false">
<mml:mfrac>
<mml:mrow>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo mathvariant="normal">,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable></mml:math><tex-math><![CDATA[\[ {\hat{\mu }_{d}}=\frac{{\textstyle\sum _{i\in \omega }}\frac{{z_{i}}}{{d_{i}}}}{{\textstyle\sum _{i\in \omega }}\frac{1}{{d_{i}}}},\]]]></tex-math></alternatives>
</disp-formula>
<disp-formula id="j_info1205_eq_007">
<label>(7)</label><alternatives><mml:math display="block">
<mml:mtable displaystyle="true" columnalign="left">
<mml:mtr>
<mml:mtd class="align-odd">
<mml:msubsup>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi mathvariant="italic">σ</mml:mi>
</mml:mrow>
<mml:mo stretchy="false">ˆ</mml:mo></mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msubsup>
<mml:mo>=</mml:mo><mml:mstyle displaystyle="true">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mo largeop="false" movablelimits="false">∑</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mo stretchy="false">∈</mml:mo>
<mml:mi mathvariant="italic">ω</mml:mi>
</mml:mrow>
</mml:msub>
<mml:msup>
<mml:mrow>
<mml:mo mathvariant="normal" fence="true" stretchy="false">(</mml:mo><mml:mstyle displaystyle="false">
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">z</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo>−</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mover accent="false">
<mml:mrow>
<mml:mi mathvariant="italic">z</mml:mi>
</mml:mrow>
<mml:mo accent="true">‾</mml:mo></mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo mathvariant="normal" fence="true" stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mrow>
<mml:mrow>
<mml:munder>
<mml:mrow>
<mml:mo largeop="false" movablelimits="false">∑</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
<mml:mo stretchy="false">∈</mml:mo>
<mml:mi mathvariant="italic">ω</mml:mi>
</mml:mrow>
</mml:munder><mml:mstyle displaystyle="false">
<mml:mfrac>
<mml:mrow>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
</mml:mrow>
</mml:mfrac>
</mml:mstyle>
<mml:mo mathvariant="normal">,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable></mml:math><tex-math><![CDATA[\[ {\hat{\sigma }_{d}^{2}}=\frac{{\textstyle\sum _{i\in \omega }}{(\frac{{z_{i}}}{{d_{i}}}-{\overline{z}_{d}})^{2}}}{\textstyle\sum \limits_{i\in \omega }\frac{1}{{d_{i}}}},\]]]></tex-math></alternatives>
</disp-formula>
</disp-formula-group> where <inline-formula id="j_info1205_ineq_093"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi mathvariant="italic">μ</mml:mi>
</mml:mrow>
<mml:mo stretchy="false">ˆ</mml:mo></mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\hat{\mu }_{d}}$]]></tex-math></alternatives></inline-formula> is the local mean and <inline-formula id="j_info1205_ineq_094"><alternatives><mml:math>
<mml:msubsup>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi mathvariant="italic">σ</mml:mi>
</mml:mrow>
<mml:mo stretchy="false">ˆ</mml:mo></mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msubsup></mml:math><tex-math><![CDATA[${\hat{\sigma }_{d}^{2}}$]]></tex-math></alternatives></inline-formula> is the local variance of the corresponding vessel feature, <italic>ω</italic> is the set of the 10 closest vessels, <inline-formula id="j_info1205_ineq_095"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">d</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">i</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${d_{i}}$]]></tex-math></alternatives></inline-formula> is the Euclidean distance between vessel patch centres.</p>
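The normalization of Eqs. (5)–(7) can be sketched as below. This is an interpretation, not the authors' code: it uses the standard inverse-distance-weighted mean and variance (with a squared deviation in the variance), and the function and parameter names are assumptions.

```python
import numpy as np

def normalize_feature(values, centres, k=10):
    """Inverse-distance normalization of one feature across vessels in the
    spirit of Eqs. (5)-(7): for each vessel, a local mean and variance are
    estimated from the k nearest other vessels with weights 1/d_i."""
    values = np.asarray(values, float)
    centres = np.asarray(centres, float)
    out = np.empty_like(values)
    for i in range(len(values)):
        d = np.linalg.norm(centres - centres[i], axis=1)
        order = np.argsort(d)
        nbr = order[d[order] > 0][:k]                 # k closest other vessels
        w = 1.0 / d[nbr]
        mu = np.sum(w * values[nbr]) / np.sum(w)      # Eq. (6)
        var = np.sum(w * (values[nbr] - mu) ** 2) / np.sum(w)  # cf. Eq. (7)
        out[i] = (values[i] - mu) / np.sqrt(var) if var > 0 else 0.0  # Eq. (5)
    return out
```

Because the mean and variance are local to each vessel's neighbourhood, a vessel in a dark corner of the image is compared against its dark surroundings rather than against the whole image, which is what removes the illumination bias.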
<fig id="j_info1205_fig_005">
<label>Fig. 5</label>
<caption>
<p>Result image from <italic>INSPIRE-AVR</italic> database (Niemeijer <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>) showing main analysis elements. Blue colour – classified veins, green colour – classified arteries. Main vessels used for <italic>AVR</italic> evaluation are marked.</p>
</caption>
<graphic xlink:href="info1205_g005.jpg"/>
</fig>
<p>After the distance-based normalization, the features are used for vessel classification with the <italic>k</italic>-means clustering method. Vessels above and below the <inline-formula id="j_info1205_ineq_096"><alternatives><mml:math>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">OD</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">r</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[${\mathit{OD}_{r}}$]]></tex-math></alternatives></inline-formula> centre point are classified separately. The vessel parts used for the features are brighter for arteries than for veins, so the class with larger feature values is treated as arteries and the class with smaller feature values as veins. The two largest vessels, one from each class, are selected for the top part and for the bottom part. From the widths of the selected vessels, top and bottom <italic>AVR</italic> values are evaluated. One eye fundus image with classification results and the main analysis elements is presented in Fig. <xref rid="j_info1205_fig_005">5</xref>. Classification results and comparisons are presented in the results section.</p>
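The classification step can be sketched as a two-class k-means with the brighter cluster labelled as arteries. The deterministic min/max initialization and the 50-iteration cap are assumptions of this sketch, not details from the paper:

```python
import numpy as np

def classify_vessels(features):
    """Two-class k-means (Lloyd's algorithm) on normalized vessel
    features; the cluster with the larger mean feature value is labelled
    artery (arteries are brighter), the other vein."""
    X = np.asarray(features, float)
    centres = np.stack([X.min(axis=0), X.max(axis=0)])  # deterministic init
    for _ in range(50):
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for c in (0, 1):
            if np.any(labels == c):
                centres[c] = X[labels == c].mean(axis=0)
    artery = int(centres.mean(axis=1).argmax())         # brighter cluster
    return np.where(labels == artery, "artery", "vein")
```

Running this separately on the vessels above and below the optic disc centre mirrors the top/bottom split described above.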
</sec>
</sec>
<sec id="j_info1205_s_006">
<label>3</label>
<title>Results</title>
<p>The proposed method is tested on three eye fundus image databases. The <italic>DRIVE</italic> (Staal <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_020">2004</xref>) database is one of the most popular for comparing different methods, but it is old, and its images have a small resolution of <inline-formula id="j_info1205_ineq_097"><alternatives><mml:math>
<mml:mn>565</mml:mn>
<mml:mo>×</mml:mo>
<mml:mn>584</mml:mn></mml:math><tex-math><![CDATA[$565\times 584$]]></tex-math></alternatives></inline-formula> pixels. There are 40 eye fundus images in this database. This database was selected for the comparison in order to show the method’s ability to work automatically on smaller images. Another database used for comparison is <italic>INSPIRE-AVR</italic> (Niemeijer <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>). There are also 40 eye fundus images in this database, and the resolution of images is <inline-formula id="j_info1205_ineq_098"><alternatives><mml:math>
<mml:mn>2394</mml:mn>
<mml:mo>×</mml:mo>
<mml:mn>2048</mml:mn></mml:math><tex-math><![CDATA[$2394\times 2048$]]></tex-math></alternatives></inline-formula> pixels. The method is also tested with <italic>OPTO-AVR</italic> database where images are acquired in a complicated situation involving physical exercises. Images are taken before the physical load, right after the physical load and after 45 minutes. Such an experiment is performed in order to test whether changes in <italic>AVR</italic> can show tiredness or stress. Photographs of both eyes are taken with Optomed OY digital mobile eye fundus camera Smartscope M5 PRO. Image resolution is <inline-formula id="j_info1205_ineq_099"><alternatives><mml:math>
<mml:mn>1536</mml:mn>
<mml:mo>×</mml:mo>
<mml:mn>1152</mml:mn></mml:math><tex-math><![CDATA[$1536\times 1152$]]></tex-math></alternatives></inline-formula> pixels, and there are 86 images in this database. Vessel extraction accuracy is the proportion of detected vessels. Vessels with widths larger than <inline-formula id="j_info1205_ineq_100"><alternatives><mml:math>
<mml:mn>3</mml:mn>
<mml:mo>×</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi mathvariant="italic">p</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">s</mml:mi>
</mml:mrow>
</mml:msub></mml:math><tex-math><![CDATA[$3\times {p_{s}}$]]></tex-math></alternatives></inline-formula> lying in the <italic>AVR</italic> measurement zone are the only ones selected. The extracted vessels were classified into two classes. Classification accuracy, artery-class specificity (<inline-formula id="j_info1205_ineq_101"><alternatives><mml:math>
<mml:mi mathvariant="italic">S</mml:mi>
<mml:mi mathvariant="italic">p</mml:mi></mml:math><tex-math><![CDATA[$Sp$]]></tex-math></alternatives></inline-formula>), sensitivity (<inline-formula id="j_info1205_ineq_102"><alternatives><mml:math>
<mml:mi mathvariant="italic">S</mml:mi>
<mml:mi mathvariant="italic">e</mml:mi></mml:math><tex-math><![CDATA[$Se$]]></tex-math></alternatives></inline-formula>) and the area under the <italic>ROC</italic> curve (<italic>AUC</italic>) were calculated. Vessel detection and classification results for the different databases are presented in Table <xref rid="j_info1205_tab_001">1</xref>.</p>
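The statistics of Table 1 follow the usual confusion-matrix definitions, which can be made explicit in a short sketch (the function name is ours; AUC additionally needs per-vessel scores and is therefore omitted):

```python
def classification_stats(tp, fp, tn, fn):
    """Accuracy, specificity and sensitivity for the artery class from
    confusion counts (tp/fn: arteries correctly/incorrectly classified,
    tn/fp: likewise for veins), as reported in Table 1."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp)   # Sp: veins correctly classified
    sensitivity = tp / (tp + fn)   # Se: arteries correctly classified
    return accuracy, specificity, sensitivity
```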
<table-wrap id="j_info1205_tab_001">
<label>Table 1</label>
<caption>
<p>Vessel extraction and classification statistics for different databases.</p>
</caption>
<table>
<thead>
<tr>
<td rowspan="2" style="vertical-align: middle; text-align: left; border-top: solid thin; border-bottom: solid thin">Database</td>
<td rowspan="2" style="vertical-align: middle; text-align: left; border-top: solid thin; border-bottom: solid thin">Vessel extraction accuracy</td>
<td colspan="4" style="vertical-align: top; text-align: left; border-top: solid thin; border-bottom: solid thin">Vessel classification statistics</td>
</tr>
<tr>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">Accuracy</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">Specificity</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">Sensitivity</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">AUC</td>
</tr>
</thead>
<tbody>
<tr>
<td style="vertical-align: top; text-align: left">DRIVE</td>
<td style="vertical-align: top; text-align: left">0.942</td>
<td style="vertical-align: top; text-align: left">0.854</td>
<td style="vertical-align: top; text-align: left">0.9353</td>
<td style="vertical-align: top; text-align: left">0.822</td>
<td style="vertical-align: top; text-align: left">0.879</td>
</tr>
<tr>
<td style="vertical-align: top; text-align: left">INSPIRE-AVR</td>
<td style="vertical-align: top; text-align: left">0.882</td>
<td style="vertical-align: top; text-align: left">0.859</td>
<td style="vertical-align: top; text-align: left">0.8539</td>
<td style="vertical-align: top; text-align: left">0.862</td>
<td style="vertical-align: top; text-align: left">0.858</td>
</tr>
<tr>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">OPTO-AVR</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">0.904</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">0.835</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">0.8344</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">0.836</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">0.835</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Methods used for vessel patch classification in the <italic>AVR</italic> measurement zone were selected for comparison. The selected methods were tested on the <italic>INSPIRE-AVR</italic> (Niemeijer <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>) and <italic>DRIVE</italic> (Staal <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_020">2004</xref>) image databases. All comparison data are gathered from the publications describing the methods. These methods are described in more detail in the introduction section, and their performance is presented in Table <xref rid="j_info1205_tab_002">2</xref>.</p>
<table-wrap id="j_info1205_tab_002">
<label>Table 2</label>
<caption>
<p>Vessel classification results for similar methods.</p>
</caption>
<table>
<thead>
<tr>
<td rowspan="2" style="vertical-align: middle; text-align: left; border-top: solid thin; border-bottom: solid thin">Method</td>
<td rowspan="2" style="vertical-align: middle; text-align: left; border-top: solid thin; border-bottom: solid thin">Database</td>
<td colspan="4" style="vertical-align: top; text-align: left; border-top: solid thin; border-bottom: solid thin">Vessel classification statistics</td>
</tr>
<tr>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">Accuracy</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">Specificity</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">Sensitivity</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">AUC</td>
</tr>
</thead>
<tbody>
<tr>
<td style="vertical-align: top; text-align: left">Muramatsu <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_013">2011</xref>)</td>
<td style="vertical-align: top; text-align: left">DRIVE</td>
<td style="vertical-align: top; text-align: left">0.928</td>
<td style="vertical-align: top; text-align: left">–</td>
<td style="vertical-align: top; text-align: left">0.87</td>
<td style="vertical-align: top; text-align: left">–</td>
</tr>
<tr>
<td style="vertical-align: top; text-align: left">Mirsharif <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_011">2013</xref>)</td>
<td style="vertical-align: top; text-align: left">DRIVE</td>
<td style="vertical-align: top; text-align: left">0.916</td>
<td style="vertical-align: top; text-align: left">–</td>
<td style="vertical-align: top; text-align: left">–</td>
<td style="vertical-align: top; text-align: left">–</td>
</tr>
<tr>
<td style="vertical-align: top; text-align: left">Dashtbozorg <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_003">2014</xref>)</td>
<td style="vertical-align: top; text-align: left">DRIVE</td>
<td style="vertical-align: top; text-align: left">0.883</td>
<td style="vertical-align: top; text-align: left">0.86</td>
<td style="vertical-align: top; text-align: left">0.91</td>
<td style="vertical-align: top; text-align: left">–</td>
</tr>
<tr>
<td style="vertical-align: top; text-align: left">Niemeijer <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>)</td>
<td style="vertical-align: top; text-align: left">INSPIRE-AVR</td>
<td style="vertical-align: top; text-align: left">–</td>
<td style="vertical-align: top; text-align: left">–</td>
<td style="vertical-align: top; text-align: left">–</td>
<td style="vertical-align: top; text-align: left">0.84</td>
</tr>
<tr>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">Dashtbozorg <italic>et al.</italic> (<xref ref-type="bibr" rid="j_info1205_ref_003">2014</xref>)</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">INSPIRE-AVR</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">0.874</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">0.84</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">0.90</td>
<td style="vertical-align: top; text-align: left; border-bottom: solid thin">–</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Compared with other methods, the proposed method achieves similar vessel classification results. Analysis of the misclassified cases shows that the main errors occur at vessel crossings and overlaps, and due to uneven lighting. Across all analysed images, 15 non-vessel locations were recognized as vessels because of vessel-like reflections. <italic>AVR</italic> evaluation was compared with the publicly available <italic>INSPIRE-AVR</italic> (Niemeijer <italic>et al.</italic>, <xref ref-type="bibr" rid="j_info1205_ref_015">2011</xref>) data, and a mean absolute error of <inline-formula id="j_info1205_ineq_103"><alternatives><mml:math>
<mml:mi mathvariant="italic">MAE</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0.093</mml:mn></mml:math><tex-math><![CDATA[$\mathit{MAE}=0.093$]]></tex-math></alternatives></inline-formula> was obtained. <italic>AVR</italic> evaluations were also compared with expert measurements on the <italic>OPTO-AVR</italic> fundus database, and <inline-formula id="j_info1205_ineq_104"><alternatives><mml:math>
<mml:mi mathvariant="italic">MAE</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0.156</mml:mn></mml:math><tex-math><![CDATA[$\mathit{MAE}=0.156$]]></tex-math></alternatives></inline-formula> was obtained.</p>
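The mean absolute error used above is the average absolute difference between the automatically evaluated and the reference <italic>AVR</italic> values over a set of images. A minimal sketch (the image values shown are hypothetical, for illustration only):

```python
def mean_absolute_error(avr_auto, avr_ref):
    """MAE between automatically evaluated AVR values and reference
    (expert or ground-truth) AVR values, one pair per image."""
    assert len(avr_auto) == len(avr_ref)
    return sum(abs(a - r) for a, r in zip(avr_auto, avr_ref)) / len(avr_auto)


# Hypothetical per-image AVR values:
mae = mean_absolute_error([0.70, 0.60, 0.80], [0.60, 0.70, 0.80])
```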
</sec>
<sec id="j_info1205_s_007">
<label>4</label>
<title>Conclusions</title>
<p>The method proposed in this paper performs fully automatic blood vessel measurement in eye fundus images, vessel classification, and evaluation of the artery-to-vein ratio based on this information. This parameter can be used for the early detection of some diseases. Novel features, based on vessel spatial information, are used for vessel classification. The features are normalized taking into account the spatial distance between vessels, which reduces the influence of uneven image lighting. The extracted features are used with the <italic>k</italic>-means method, but they can also be used with other common classification methods. The method is tested on the <italic>DRIVE</italic>, <italic>INSPIRE-AVR</italic> and <italic>OPTO-AVR</italic> eye fundus image databases, yielding vessel classification accuracies of 0.854, 0.859 and 0.835, respectively. The proposed unsupervised method is competitive with other state-of-the-art supervised classification methods in this field. As is common for unsupervised methods, it produces slightly less accurate classification results, but it has the advantage of universality: it does not require adjustment for new image sets, is adaptive to different image sizes, and copes with noisy or unevenly lit images.</p>
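To illustrate the unsupervised two-class step described above, a heavily simplified sketch of two-means clustering on a single hypothetical per-vessel feature (e.g. normalized patch brightness, since arteries tend to appear brighter than veins in fundus images). The paper's actual method uses richer, spatially normalized features; this is not the authors' implementation.

```python
def two_means(features, iters=20):
    """Cluster 1-D feature values into two groups (k-means with k = 2).

    Returns a 0/1 label per input value; which label denotes arteries
    would be decided afterwards, e.g. by cluster brightness.
    """
    # Initialize the two centres at the feature extremes.
    c0, c1 = min(features), max(features)
    for _ in range(iters):
        g0 = [f for f in features if abs(f - c0) <= abs(f - c1)]
        g1 = [f for f in features if abs(f - c0) > abs(f - c1)]
        if g0:
            c0 = sum(g0) / len(g0)  # recompute centre of cluster 0
        if g1:
            c1 = sum(g1) / len(g1)  # recompute centre of cluster 1
    return [0 if abs(f - c0) <= abs(f - c1) else 1 for f in features]


# Hypothetical brightness values for six vessel patches:
labels = two_means([0.12, 0.15, 0.10, 0.80, 0.85, 0.90])
```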
</sec>
</body>
<back>
<ref-list id="j_info1205_reflist_001">
<title>References</title>
<ref id="j_info1205_ref_001">
<mixed-citation publication-type="other"><string-name><surname>Bankhead</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>Scholfield</surname>, <given-names>C.N.</given-names></string-name>, <string-name><surname>McGeown</surname>, <given-names>J.G.</given-names></string-name>, <string-name><surname>Curtis</surname>, <given-names>T.M.</given-names></string-name> (2012). Fast retinal vessel detection and measurement using wavelets and edge location refinement. <italic>PLOS ONE</italic>, 7.</mixed-citation>
</ref>
<ref id="j_info1205_ref_002">
<mixed-citation publication-type="journal"><string-name><surname>Buteikienė</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Paunksnis</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Barzdžiukas</surname>, <given-names>V.</given-names></string-name>, <string-name><surname>Bernatavičienė</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Marcinkevičius</surname>, <given-names>V.</given-names></string-name>, <string-name><surname>Treigys</surname>, <given-names>P.</given-names></string-name> (<year>2012</year>). <article-title>Assessment of the optic nerve disc and excavation parameters of interactive and automated parameterization methods</article-title>. <source>Informatica</source>, <volume>23</volume>(<issue>3</issue>), <fpage>335</fpage>–<lpage>355</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_003">
<mixed-citation publication-type="journal"><string-name><surname>Dashtbozorg</surname>, <given-names>B.</given-names></string-name>, <string-name><surname>Mendonca</surname>, <given-names>A.M.</given-names></string-name>, <string-name><surname>Campilho</surname>, <given-names>A.</given-names></string-name> (<year>2014</year>). <article-title>An automatic graph-based approach for artery/vein classification in retinal images</article-title>. <source>IEEE Transactions on Image Processing</source>, <volume>23</volume>(<issue>3</issue>), <fpage>1073</fpage>–<lpage>1083</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_004">
<mixed-citation publication-type="chapter"><string-name><surname>Fairfield</surname>, <given-names>J.</given-names></string-name> (<year>1990</year>). <chapter-title>Toboggan contrast enhancement for contrast segmentation</chapter-title>. In: <source>Proceedings of International Conference on Pattern Recognition</source>, pp. <fpage>712</fpage>–<lpage>716</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_005">
<mixed-citation publication-type="chapter"><string-name><surname>Fraz</surname>, <given-names>M.M.</given-names></string-name>, <string-name><surname>Rudnicka</surname>, <given-names>A.R.</given-names></string-name>, <string-name><surname>Owen</surname>, <given-names>C.G.</given-names></string-name>, <string-name><surname>Strachan</surname>, <given-names>D.P.</given-names></string-name>, <string-name><surname>Barman</surname>, <given-names>S.A.</given-names></string-name> (<year>2014</year>). <chapter-title>Automated arteriole and venule recognition in retinal images using ensemble classification</chapter-title>. In: <source>2014 International Conference on Computer Vision Theory and Applications (VISAPP)</source>, Vol. <volume>3</volume>, pp. <fpage>194</fpage>–<lpage>202</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_006">
<mixed-citation publication-type="journal"><string-name><surname>Knudtson</surname>, <given-names>M.D.</given-names></string-name>, <string-name><surname>Lee</surname>, <given-names>K.E.</given-names></string-name>, <string-name><surname>Hubbard</surname>, <given-names>L.D.</given-names></string-name>, <string-name><surname>Wong</surname>, <given-names>T.Y.</given-names></string-name>, <string-name><surname>Klein</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Klein</surname>, <given-names>B.E.K.</given-names></string-name> (<year>2003</year>). <article-title>Revised formulas for summarizing retinal vessel diameters</article-title>. <source>Current Eye Research</source>, <volume>27</volume>(<issue>3</issue>), <fpage>143</fpage>–<lpage>149</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_007">
<mixed-citation publication-type="chapter"><string-name><surname>Li</surname>, <given-names>H.</given-names></string-name>, <string-name><surname>Hsu</surname>, <given-names>W.</given-names></string-name>, <string-name><surname>Lee</surname>, <given-names>M.L.</given-names></string-name>, <string-name><surname>Wang</surname>, <given-names>H.</given-names></string-name> (<year>2003</year>). <chapter-title>A piecewise Gaussian model for profiling and differentiating retinal vessels</chapter-title>. In: <source>Proceedings 2003 International Conference on Image Processing (Cat. No. 03CH37429)</source>, Vol. <volume>1</volume>. <comment>I-1069-72</comment>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_008">
<mixed-citation publication-type="journal"><string-name><surname>Li</surname>, <given-names>X.</given-names></string-name>, <string-name><surname>Wee</surname>, <given-names>W.G.</given-names></string-name> (<year>2014</year>). <article-title>Retinal vessel detection and measurement for computer-aided medical diagnosis</article-title>. <source>Journal of Digital Imaging</source>, <volume>27</volume>, <fpage>120</fpage>–<lpage>132</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_009">
<mixed-citation publication-type="book"><string-name><surname>Mendonca</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Dashtbozorg</surname>, <given-names>B.</given-names></string-name>, <string-name><surname>Campilho</surname>, <given-names>A.</given-names></string-name> (<year>2013</year>). <source>Segmentation of the vascular network of the retina</source>. <series>Image Analysis and Modeling in Opthalmology</series>. <publisher-name>CRC Press</publisher-name>, <publisher-loc>USA</publisher-loc>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_010">
<mixed-citation publication-type="journal"><string-name><surname>Miri</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Amini</surname>, <given-names>Z.</given-names></string-name>, <string-name><surname>Rabbani</surname>, <given-names>H.</given-names></string-name>, <string-name><surname>Kafie</surname>, <given-names>R.</given-names></string-name> (<year>2017</year>). <article-title>A comprehensive study of retinal vessel classification methods in fundus images</article-title>. <source>Journal of Medical Signals &amp; Sensors</source>, <volume>7</volume>(<issue>2</issue>), <fpage>59</fpage>–<lpage>70</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_011">
<mixed-citation publication-type="journal"><string-name><surname>Mirsharif</surname>, <given-names>Q.</given-names></string-name>, <string-name><surname>Tajeripour</surname>, <given-names>F.</given-names></string-name>, <string-name><surname>Pourreza</surname>, <given-names>H.</given-names></string-name> (<year>2013</year>). <article-title>Automated characterization of blood vessels as arteries and veins in retinal images</article-title>. <source>Computerized Medical Imaging and Graphics</source>, <volume>37</volume>, <fpage>607</fpage>–<lpage>617</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_012">
<mixed-citation publication-type="journal"><string-name><surname>Morkunas</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Treigys</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>Bernataviciene</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Laurinavicius</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Korvel</surname>, <given-names>G.</given-names></string-name> (<year>2018</year>). <article-title>Machine learning based classification of colorectal cancer tumour tissue in whole-slide images</article-title>. <source>Informatica</source>, <volume>29</volume>(<issue>1</issue>), <fpage>75</fpage>–<lpage>90</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_013">
<mixed-citation publication-type="journal"><string-name><surname>Muramatsu</surname>, <given-names>C.</given-names></string-name>, <string-name><surname>Hatanaka</surname>, <given-names>Y.</given-names></string-name>, <string-name><surname>Iwase</surname>, <given-names>T.</given-names></string-name>, <string-name><surname>Hara</surname>, <given-names>T.</given-names></string-name>, <string-name><surname>Fujita</surname>, <given-names>H.</given-names></string-name> (<year>2011</year>). <article-title>Automated selection of major arteries and veins for measurement of arteriolar-to-venular diameter ratio on retinal fundus images</article-title>. <source>Computerized Medical Imaging and Graphics</source>, <volume>35</volume>, <fpage>472</fpage>–<lpage>480</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_014">
<mixed-citation publication-type="journal"><string-name><surname>Niemeijer</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Abramoff</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>van Ginneken</surname>, <given-names>B.</given-names></string-name> (<year>2009</year>). <article-title>Fast detection of the optic disc and fovea in color fundus photographs</article-title>. <source>Medical Image Analysis</source>, <volume>13</volume>(<issue>6</issue>), <fpage>859</fpage>–<lpage>870</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_015">
<mixed-citation publication-type="journal"><string-name><surname>Niemeijer</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Xu</surname>, <given-names>X.</given-names></string-name>, <string-name><surname>Dumitrescu</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Gupta</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>van Ginneken</surname>, <given-names>B.</given-names></string-name>, <string-name><surname>Folk</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Abramoff</surname>, <given-names>M.</given-names></string-name> (<year>2011</year>). <article-title>Automated measurement of the arteriolar-to-venular width ratio in digital color fundus photographs</article-title>. <source>IEEE Transactions on Medical Imaging</source>, <volume>30</volume>(<issue>11</issue>), <fpage>1941</fpage>–<lpage>1950</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_016">
<mixed-citation publication-type="journal"><string-name><surname>Prasath</surname>, <given-names>V.B.S.</given-names></string-name> (<year>2017</year>). <article-title>Quantum noise removal in X-ray images with adaptive total variation regularization</article-title>. <source>Informatica</source>, <volume>28</volume>(<issue>3</issue>), <fpage>505</fpage>–<lpage>515</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_017">
<mixed-citation publication-type="chapter"><string-name><surname>Ravishankar</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Jain</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Mittal</surname>, <given-names>A.</given-names></string-name> (<year>2009</year>). <chapter-title>Automated feature extraction for early detection of diabetic retinopathy in fundus images</chapter-title>. In: <source>2009 IEEE Conference on Computer Vision and Pattern Recognition</source>, <conf-loc>Miami, FL</conf-loc>, pp. <fpage>210</fpage>–<lpage>217</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_018">
<mixed-citation publication-type="other"><string-name><surname>Renukalatha</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Suresh</surname>, <given-names>K.V.</given-names></string-name> (2018). A review on biomedical image analysis. <italic>Biomedical Engineering: Applications, Basis and Communications</italic>, <italic>30</italic>(4).</mixed-citation>
</ref>
<ref id="j_info1205_ref_019">
<mixed-citation publication-type="journal"><string-name><surname>Soares</surname>, <given-names>J.V.B.</given-names></string-name>, <string-name><surname>Leandro</surname>, <given-names>J.J.G.</given-names></string-name>, <string-name><surname>Cesar</surname>, <given-names>R.M.</given-names></string-name>, <string-name><surname>Jelinek</surname>, <given-names>H.F.</given-names></string-name>, <string-name><surname>Cree</surname>, <given-names>M.J.</given-names></string-name> (<year>2006</year>). <article-title>Retinal vessel segmentation using the 2-D Gabor wavelet and supervised classification</article-title>. <source>IEEE Transactions on Medical Imaging</source>, <volume>25</volume>(<issue>9</issue>), <fpage>1214</fpage>–<lpage>1222</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_020">
<mixed-citation publication-type="journal"><string-name><surname>Staal</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Abramoff</surname>, <given-names>M.D.</given-names></string-name>, <string-name><surname>Niemeijer</surname>, <given-names>M.</given-names></string-name>, <string-name><surname>Viergever</surname>, <given-names>M.A.</given-names></string-name>, <string-name><surname>van Ginneken</surname>, <given-names>B.</given-names></string-name> (<year>2004</year>). <article-title>Ridge based vessel segmentation in color images of the retina</article-title>. <source>IEEE Transactions on Medical Imaging</source>, <volume>23</volume>, <fpage>501</fpage>–<lpage>509</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_021">
<mixed-citation publication-type="chapter"><string-name><surname>Stabingis</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Bernatavičienė</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Dzemyda</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Imbrasienė</surname>, <given-names>D.</given-names></string-name>, <string-name><surname>Paunksnis</surname>, <given-names>A.</given-names></string-name> (<year>2016</year>). <chapter-title>Automated classification of arteries and veins in the retinal blood vasculature</chapter-title>. In: <source>Proceedings of the 11th International Conference on Theoretical &amp; Applied Stochastics</source>, <conf-loc>Minsk, Belarus</conf-loc>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_022">
<mixed-citation publication-type="chapter"><string-name><surname>Stabingis</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Bernatavičienė</surname>, <given-names>J.</given-names></string-name>, <string-name><surname>Dzemyda</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Paunksnis</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Treigys</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>Vaičaitienė</surname>, <given-names>R.</given-names></string-name>, <string-name><surname>Stabingienė</surname>, <given-names>L.</given-names></string-name> (<year>2018</year>). <chapter-title>Automatization of eye fundus vessel width measurements</chapter-title>. In: <source>VipIMAGE 2017. ECCOMAS 2017</source>, <series><italic>Lecture Notes in Computational Vision and Biomechanics</italic></series>, Vol. <volume>27</volume>, pp. <fpage>787</fpage>–<lpage>796</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_023">
<mixed-citation publication-type="journal"><string-name><surname>Sun</surname>, <given-names>C.</given-names></string-name>, <string-name><surname>Wang</surname>, <given-names>J.J.</given-names></string-name>, <string-name><surname>Mackey</surname>, <given-names>D.A.</given-names></string-name>, <string-name><surname>Wong</surname>, <given-names>T.Y.</given-names></string-name> (<year>2009</year>). <article-title>Retinal vascular caliber: systemic, environmental, and genetic associations</article-title>. <source>Survey of Ophthalmology</source>, <volume>54</volume>(<issue>1</issue>), <fpage>74</fpage>–<lpage>95</lpage>.</mixed-citation>
</ref>
<ref id="j_info1205_ref_024">
<mixed-citation publication-type="chapter"><string-name><surname>Treigys</surname>, <given-names>P.</given-names></string-name>, <string-name><surname>Dzemyda</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Barzdžiukas</surname>, <given-names>V.</given-names></string-name> (<year>2008</year>). <chapter-title>Automated positioning of overlapping eye fundus images</chapter-title>. In: <source>Computational Science – ICCS 2008. ICCS 2008</source>, <series><italic>Lecture Notes in Computer Science</italic></series>, Vol. <volume>5101</volume>, pp. <fpage>770</fpage>–<lpage>779</lpage>.</mixed-citation>
</ref>
</ref-list>
</back>
</article>