Please use this identifier to cite or link to this item: http://repository.elizadeuniversity.edu.ng/jspui/handle/20.500.12398/528
Full metadata record
DC Field | Value | Language
dc.contributor.author | Kader, S. | -
dc.contributor.author | Aibinu, A. M. | -
dc.contributor.author | Salami, Momoh-Jimoh E. | -
dc.date.accessioned | 2019-08-16T13:24:46Z | -
dc.date.available | 2019-08-16T13:24:46Z | -
dc.date.issued | 2012-12-17 | -
dc.identifier.citation | Kader, S., Aibinu, A. M., & Salami, M. J. E. (2012, December). A new method of vascular point detection using artificial neural network. In 2012 IEEE-EMBS Conference on Biomedical Engineering and Sciences (pp. 728-733). IEEE. | en_US
dc.identifier.uri | http://repository.elizadeuniversity.edu.ng/jspui/handle/20.500.12398/528 | -
dc.description.abstract | Vascular intersections are an important feature of retina fundus images (RFI). They can be used to monitor the progression of diabetes; hence, accurately determining vascular points is of utmost importance. In this work, a new method of vascular point detection using an artificial neural network (ANN) model is proposed. The method uses a 5×5 window to detect the combination of bifurcation and crossover points in a retina fundus image. Simulated images are used to train the artificial neural network and, on convergence, the network is used to test RFIs from the DRIVE database. Performance analysis of the system shows that the ANN-based technique achieves 100% accuracy on simulated images and a minimum of 92% accuracy on RFIs obtained from the DRIVE database. | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE | en_US
dc.subject | Diabetic Retinopathy | en_US
dc.subject | Retina | en_US
dc.subject | Vascular points | en_US
dc.subject | Artificial Neural Network | en_US
dc.title | A new method of vascular point detection using artificial neural network | en_US
dc.type | Article | en_US
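
The abstract describes a window-based classifier: 5×5 neighbourhoods of a binarized retinal vessel map are fed to a trained artificial neural network that flags bifurcation and crossover points. The Python sketch below illustrates that idea only; the training patterns, the network size (one 10-unit hidden layer via scikit-learn's MLPClassifier) and the synthetic vessel map are illustrative assumptions and do not reproduce the authors' simulated images or architecture.

import numpy as np
from sklearn.neural_network import MLPClassifier

def extract_windows(vessel_map, size=5):
    """Return every flattened size x size window centred on a vessel pixel."""
    h, w = vessel_map.shape
    r = size // 2
    windows, centres = [], []
    for y in range(r, h - r):
        for x in range(r, w - r):
            if vessel_map[y, x]:  # only vessel pixels are junction candidates
                patch = vessel_map[y - r:y + r + 1, x - r:x + r + 1]
                windows.append(patch.astype(float).ravel())
                centres.append((y, x))
    return np.array(windows), centres

# Toy "simulated" training patches (illustrative, not the paper's data).
def junction_patch():
    p = np.zeros((5, 5), dtype=int)
    p[:, 2] = 1          # vertical vessel
    p[2, 2:] = 1         # right-hand branch -> T-shaped bifurcation
    return p

def segment_patch():
    p = np.zeros((5, 5), dtype=int)
    p[:, 2] = 1          # plain vessel segment, no branching
    return p

rng = np.random.default_rng(0)
def noisy(patch):
    q = patch.copy()
    yy, xx = rng.integers(0, 5, size=2)
    q[yy, xx] ^= 1       # flip one pixel so training samples are not identical
    return q

X = np.array([noisy(junction_patch()).ravel() for _ in range(50)] +
             [noisy(segment_patch()).ravel() for _ in range(50)], dtype=float)
y = np.array([1] * 50 + [0] * 50)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X, y)

# Apply the trained network to a small synthetic binarized vessel map.
vessel_map = np.zeros((20, 20), dtype=int)
vessel_map[:, 10] = 1    # vertical vessel
vessel_map[10, 10:] = 1  # horizontal branch creates one junction at (10, 10)
windows, centres = extract_windows(vessel_map)
labels = clf.predict(windows)
print("candidate vascular points:", [c for c, lab in zip(centres, labels) if lab == 1])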
Appears in Collections: Research Articles

Files in This Item:
File | Description | Size | Format
A new method of vascular point detection using artificial neural network.pdf | Article full-text | 676.23 kB | Adobe PDF


Items in EUSpace are protected by copyright, with all rights reserved, unless otherwise indicated.