ORIGINAL ARTICLE
Year : 2020  |  Volume : 10  |  Issue : 4  |  Page : 249-259

A new smart CMOS image sensor with on-chip neuro-fuzzy bleeding detection system for wireless capsule endoscopy

Peiman Aliparast

Department of Astronautics Research, Aerospace Research Institute, Ministry of Science Research and Technology, Tehran, Iran

Date of Submission: 07-Oct-2019
Date of Decision: 30-Oct-2019
Date of Acceptance: 29-Jan-2020
Date of Web Publication: 11-Nov-2020

Correspondence Address:
Dr. Peiman Aliparast
Aerospace Research Institute, Ministry of Science Research and Technology, Tehran 14665-834
Iran

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/jmss.JMSS_56_19

  Abstract 


Background: In this paper, we present a new custom smart CMOS image sensor (CIS) for low-power wireless capsule endoscopy. Method: The proposed smart CIS includes a 256 × 256 current-mode pixel array with a new on-chip adaptive neuro-fuzzy inference system used to diagnose bleeding images. We use a new pinned photodiode to realize the current-mode active pixels in a standard CMOS process. The proposed chip has been implemented in 0.18 μm CMOS 1P6M TSMC RF technology with a die area of 7 mm × 8 mm. Results and Conclusion: Building the bleeding detection system into the CIS reduces the RF transmitter power consumption to nearly zero. The average power dissipation of the proposed smart CIS is 610 μW.

Keywords: CMOS, image sensors, neuro-fuzzy, photodiode, pixel, wireless capsule endoscopy


How to cite this article:
Aliparast P. A new smart CMOS image sensor with on-chip neuro-fuzzy bleeding detection system for wireless capsule endoscopy. J Med Signals Sens 2020;10:249-59

How to cite this URL:
Aliparast P. A new smart CMOS image sensor with on-chip neuro-fuzzy bleeding detection system for wireless capsule endoscopy. J Med Signals Sens [serial online] 2020 [cited 2020 Nov 23];10:249-59. Available from: https://www.jmssjournal.net/text.asp?2020/10/4/249/300505




  Introduction


Recent years have seen the rapid emergence of CMOS image sensors (CIS) as the technology of choice for a wide range of consumer products, from mobile phones to digital cameras and camcorders. CIS also has exciting potential in biomedical applications. For instance, an endoscope system employs a CIS to capture pictures from the gastrointestinal (GI) tract of humans. The continuous quest for painless diagnostic procedures in the GI tract has resulted in greater interest in endoluminal techniques, such as capsule endoscopy.[1] An endoscopic capsule is a swallowable self-contained microsystem which performs a sensing or actuating function in the body.[2] [Figure 1] shows a specimen of the first-generation endoscopic capsule.[3] The first capsule endoscope model was developed by Given Imaging in 2000[4] and received medical approval from the FDA in Western countries in 2001. The first capsule was commercialized by Given Imaging under the name PillCam SB, specially designed for small bowel investigation. Essentially, the PillCam SB is a swallowable wireless miniaturized camera which provides images. Despite research on actuation,[5] drug delivery, and biopsy techniques that may be implemented in an endoscopic capsule,[2],[6],[7] the imaging unit is still the core part of the system. The main goal of endoscopy is to inspect the inside of the body through imaging techniques for diagnostic and surgical purposes. The main design challenges lie in the miniature size of the capsule. First, the battery-operated capsule is strictly power-constrained, and effective low-power techniques must be employed to ensure an adequately long working time. Second, the miniaturization requirement leads to a strict constraint on the size of the printed circuit board used in the capsule, so the circuits must be highly integrated and the number of off-chip components kept as small as possible.[4] To achieve these targets, we propose a new system architecture as a smart CIS in the next section.
Figure 1: Wireless capsule endoscopy; the procedure is shown at left and the capsule details at right[3]




  The Proposed System Architecture


Bleeding is the most common disease in the GI tract, and many other diseases are often accompanied by bleeding.[8] Meanwhile, wireless capsule endoscopy (WCE) has been proposed for use in the management of patients with obscure digestive tract bleeding.[9] In the proposed structure, considering that more than 50% of the power is consumed by the RF section of a WCE, we use a dedicated low-power neuro-fuzzy controller unit to reduce the power dissipation of this section. In fact, we integrate a bleeding detection system on the CIS. In the proposed smart CIS, a bleeding detection unit examines each frame of the captured image and decides whether it shows bleeding. If the image is recognized as a bleeding picture, the readout, analog-to-digital converter (ADC), and RF transmitter units start to work and take 50 pictures to send to the wireless recorder on the patient's belt; if the captured image is recognized as a nonbleeding picture, none of these subsystems operates. This technique leads to sending only a few pictures instead of the more than 60,000 pictures sent by a conventional WCE. Moreover, in a conventional WCE, because of the large number of captured images (more than 60,000), it is very laborious and time-consuming for physicians to detect the bleeding regions or other abnormal characteristics. It takes more than 3 h to inspect these images even for a skilled expert, which is not only time-consuming but also prone to false detection because of visual fatigue. Therefore, the physician needs a computer-aided intelligent bleeding detection technique to process the WCE video automatically. In the proposed smart CIS, this intelligent bleeding detection system is integrated with the CIS on the same chip. To realize the blood detection system, we design an analog CMOS neuro-fuzzy controller with three inputs and one output. Using fuzzy systems requires a series of "if-then" rules to be defined based on human knowledge, which is usually done by experts. Furthermore, we face a problem in determining and adjusting membership functions (MFs) to obtain maximum performance and minimum output error, and a similar problem in choosing the output function parameters. If we have a collection of input/output data, a proper combination of two powerful theories, neural networks and fuzzy systems, helps us solve these problems. The adaptive neuro-fuzzy inference system (ANFIS) architecture, which combines the learning ability and weight optimization of neural networks with the linguistic decision-making strength of fuzzy logic, provides a 5-layer neuro-fuzzy system.[10] [Figure 2] shows the block diagram of the proposed system architecture for the new smart CIS. As shown in [Figure 2], two ANFIS blocks work together in parallel to detect bleeding in the pictures captured from the GI tract. In this mode, the CIS works at 2 frames per second. When the ANFIS detects an abnormality in the GI tract of the patient, the system changes to the send mode and starts to take and send pictures at 5 frames per second for 10 s. For the implementation of the proposed architecture, we use a commercial 0.18 μm CMOS RF technology that lets us integrate the CIS, processing units, and RF transmitter on the same chip as a system-on-chip. In this paper, we first present the proposed blood-based abnormality detection structure using neuro-fuzzy processing units (section III). We explain the circuit design and implementation of the proposed smart CIS structure in section IV. In section V, the simulation results of the proposed smart CIS are presented. Finally, section VI concludes the article.
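To illustrate the frame-gating behavior described above, the following minimal Python model sketches the mode-switching loop. It is only a behavioral sketch: capture_frame, anfis_is_bleeding, and rf_transmit are hypothetical placeholder callbacks (not the actual on-chip blocks), and the timing constants simply restate the figures given in the text.

```python
import time

FPS_MONITOR = 2        # mode 1: frame rate while screening for bleeding
FPS_SEND = 5           # mode 2: frame rate while transmitting
SEND_DURATION_S = 10   # mode 2 lasts 10 s (about 50 pictures at 5 fps)

def run_capsule(capture_frame, anfis_is_bleeding, rf_transmit):
    """Behavioral model: readout/ADC/RF stay idle unless bleeding is detected.

    The three arguments are hypothetical callbacks standing in for the pixel
    array readout, the on-chip ANFIS pair, and the RF transmitter.
    """
    while True:
        frame = capture_frame()               # mode 1: screen at 2 fps
        if anfis_is_bleeding(frame):          # ANFIS flags an abnormality
            t_end = time.time() + SEND_DURATION_S
            while time.time() < t_end:        # mode 2: capture and send
                rf_transmit(capture_frame())
                time.sleep(1.0 / FPS_SEND)
        else:
            time.sleep(1.0 / FPS_MONITOR)     # RF transmitter stays off
```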
Figure 2: Block diagram of the proposed smart CMOS image sensor structure




  Adaptive Neuro-Fuzzy Inference System for Blood-Based Abnormalities Detection


Blood-based abnormalities in the small bowel are classified into three categories:

  • Bleeding,
  • Angioectasia, and
  • Erythema.


[Figure 3] shows a sample picture for each of these categories, taken by a PillCam.[11] Bleeding is defined as the flow of blood from a ruptured blood vessel into the digestive tract [Figure 3]a. Among malignant tumors, leiomyosarcoma is most commonly associated with bleeding.[12] In the medical literature, angioectasia is also referred to as arteriovenous malformation. It is the most common abnormality accounting for obscure GI bleeding, seen in 53% of patients who undergo capsule endoscopy,[13] which makes its detection an important task. Angioectasias occur more frequently with increasing age and can be identified at endoscopy as spider-like lesions.[14] Most angioectasia cases in the small intestine are caused by blood vessels inside the intestinal walls; in contrast to bleeding, angioectasia stays inside the intestinal walls. The color of an angioectasia often appears more reddish than the color of bleeding [Figure 3]b. In general, erythema is defined as skin redness caused by capillary congestion. In the small intestine, erythema multiforme is strongly related to various abnormalities (e.g., Crohn's disease). It is an acute, self-limiting, inflammatory skin eruption. Its redness is less than that of angioectasia [Figure 3]c. In a study by Al-Rahayfeh and Abuzneid,[15] an automated classification that uses the purity of the red color to detect bleeding areas in WCE images was reported. They divided each image into pixels, applied a range-ratio color condition to each pixel, and then counted the number of pixels that satisfied the condition; if this number was greater than zero, the frame was classified as a bleeding frame. Experimental results of this simple algorithm show 98% accuracy without using any smart classifier such as ANFIS. In a study by Karargyris and Bourbakis,[16] a novel methodology for automatically detecting blood-based abnormalities (bleeding, angioectasia, and erythema) in WCE videos was presented. The methodology was based on the synergistic integration of methods such as color K-L transformation, fuzzy region segmentation, and local-global graphs, and it presented several unique features that separate it from classical bleeding-based methods, mainly the classification of blood-based pathological cases into three major categories approved by gastroenterologists. Hence, we use the same classification of blood-based pathological cases in our proposed detection system. Also, in a study by Kodogiannis et al.,[17] an integrated methodology for detecting abnormal patterns in WCE images was reported. Two issues were addressed: the extraction of texture features from the texture spectra in the chromatic and achromatic domains of each color component histogram of WCE images, and the fusion of multiple classifiers. An advanced neuro-fuzzy learning scheme was also adopted, and a measured accuracy of 98.5% was reported for the proposed fuzzy logic system; this high detection accuracy indicates that such intelligent schemes could be used as a supplementary diagnostic tool in WCE. Our proposed smart detection system has been customized to detect the abnormality cases mentioned above by decision-making in the color space, and for the smart decision system we use the ANFIS structure.
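As a toy illustration of the range-ratio idea attributed to Al-Rahayfeh and Abuzneid[15] above, the sketch below flags a frame as bleeding when at least one pixel satisfies a red-dominance condition. The threshold values and function name are assumptions chosen for illustration, not the published parameters.

```python
import numpy as np

def is_bleeding_frame(rgb, r_min=0.55, g_max_ratio=0.6, b_max_ratio=0.6):
    """Classify an RGB frame (H x W x 3, values in [0, 1]) as bleeding if at
    least one pixel passes a range-ratio condition. Thresholds are
    illustrative assumptions, not the values used in [15]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-6
    cond = (r > r_min) & (g / (r + eps) < g_max_ratio) & (b / (r + eps) < b_max_ratio)
    return np.count_nonzero(cond) > 0   # any qualifying pixel => bleeding frame
```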
The proposed detection system is composed of two ANFIS structures that work in parallel. As shown in [Figure 4], the CIS area (256 × 256 pixel array) is divided into 32 × 32 regions (1024 segments), each consisting of an 8 × 8 pixel matrix. Considering the color filter array structure, each 8 × 8 pixel matrix has 32 pixels for green, 16 pixels for red, and 16 pixels for blue [Figure 4]. We compute the average of each color in this area and feed it into the ANFIS units. Hence, the minimum detection area is limited to 700 μm × 700 μm. The ANFIS estimates the color of this area and decides whether it shows an abnormality. Clearly, with more processing units the delay of the detection system can be decreased, but the active area on the chip and the power consumption will increase.
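A compact sketch of this segmentation and per-color averaging might look as follows, assuming the pixel currents and the color filter array layout are available as arrays; the function and variable names are illustrative only.

```python
import numpy as np

SEG = 8  # each segment is an 8 x 8 pixel matrix; 256 / 8 = 32 segments per side

def segment_color_averages(raw, cfa):
    """raw: 256 x 256 array of pixel output currents; cfa: 256 x 256 array of
    labels 'R', 'G', 'B' describing the color filter array. Returns a
    32 x 32 x 3 array with the average red, green, and blue current of every
    segment, ready to feed the ANFIS inputs (IRed, IGreen, IBlue)."""
    out = np.zeros((256 // SEG, 256 // SEG, 3))
    for i in range(0, 256, SEG):
        for j in range(0, 256, SEG):
            block, labels = raw[i:i + SEG, j:j + SEG], cfa[i:i + SEG, j:j + SEG]
            for k, color in enumerate("RGB"):
                out[i // SEG, j // SEG, k] = block[labels == color].mean()
    return out
```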
Figure 3: Blood-based abnormalities: (a) Bleeding, (b) Angioectasia and (c) Erythema, (Crohn's Disease)[11]

Figure 4: The CMOS image sensor area (256 × 256 pixel array) is divided into 32 × 32 regions (1024 segments), each consisting of an 8 × 8 pixel matrix



In this application, we use two ANFIS units as a trade-off between speed, power consumption, and active area; each ANFIS controller processes half of the captured image. We trained the proposed ANFIS with 65 sample pictures for each of the abnormalities mentioned above. [Figure 5] shows a sample picture used for training the proposed ANFIS. The hybrid training algorithm helped us achieve optimal MFs and output function coefficients after 26 epochs. We obtained 98% bleeding detection accuracy, and the comparison level of the output classification perceptron was tuned to 19% by the adaptive learning system. [Figure 6] shows the ANFIS controller with three inputs (IRed, IBlue, and IGreen), one output (probability of bleeding [POB]), and 24 rules, which has been implemented in MATLAB. The first input (IRed) is divided into six MFs (because the decision vector rotates in the color space along the red axis); the second (IGreen) and third (IBlue) are each divided into two MFs as a trade-off between hardware requirements and accuracy. As shown in [Figure 6], every node in layer 1 is a fuzzifier node. In this layer, we need a bell-shaped function as shown in [Figure 7], so that we can change the a, b, and c parameters based on the learning pattern to obtain the optimum shape of the MF. The equation of this layer is shown in Eq. 1.
Figure 5: The sample picture for training the proposed adaptive neuro-fuzzy inference system

Figure 6: The proposed adaptive neuro-fuzzy inference system structure for blood-based abnormalities detection

Figure 7: Generalized bell-shaped membership function and its parameters

$$O_i^{1} = \mu_{A_i}(x) \qquad \text{(1)}$$
where $O_i^{1}$ is the output of the first layer and $\mu_{A_i}(x)$ is the generalized bell-shaped MF given by Eq. 2.

$$\mu_{A_i}(x) = \frac{1}{1 + \left|\dfrac{x - c_i}{a_i}\right|^{2 b_i}} \qquad \text{(2)}$$
In the second layer, the fuzzy AND operator acts on the MFs and produces the firing strength (weight) of each rule, as shown in Eq. 3, where $\wedge$ denotes the fuzzy AND.

$$w_i = \mu_{R_i}(I_{Red}) \wedge \mu_{G_i}(I_{Green}) \wedge \mu_{B_i}(I_{Blue}), \qquad i = 1, \ldots, 24 \qquad \text{(3)}$$
In the third layer, the weights are normalized, as shown in Eq. 4.

$$\bar{w}_i = \frac{w_i}{\sum_{j=1}^{24} w_j} = \frac{w_i}{W_t} \qquad \text{(4)}$$
The output of the fourth layer, according to the Takagi-Sugeno-Kang (TSK) method, is shown in Eq. 5, where $f_i$ is the consequent of rule $i$ (here a singleton value $C_i$).

$$O_i^{4} = \bar{w}_i f_i = \frac{w_i C_i}{W_t} \qquad \text{(5)}$$
Finally, the output of the proposed ANFIS (POB) is given by Eq. 6.

$$POB = \sum_{i=1}^{24} \bar{w}_i f_i = \frac{\sum_{i=1}^{24} w_i C_i}{\sum_{i=1}^{24} w_i} \qquad \text{(6)}$$
The output of each ANFIS (POB) feeds a single current-to-voltage converting perceptron.
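To make Eqs. 1-6 concrete, here is a minimal software sketch of the forward pass, assuming generalized bell MFs, min as the fuzzy AND (as in the min-max circuit of section IV), and singleton consequents Ci. The function names, rule indexing, and all parameter values are illustrative assumptions, not the trained on-chip values.

```python
import numpy as np
from itertools import product

def bell(x, a, b, c):
    """Generalized bell membership function of Eq. 2."""
    return 1.0 / (1.0 + np.abs((x - c) / a) ** (2 * b))

def anfis_pob(i_red, i_green, i_blue, mf_red, mf_green, mf_blue, singletons):
    """Forward pass of the 3-input ANFIS (Eqs. 1-6).

    mf_red holds 6 (a, b, c) tuples, mf_green and mf_blue hold 2 each, giving
    6 x 2 x 2 = 24 rules; singletons maps each rule index triple (r, g, b)
    to its consequent Ci. All values are illustrative placeholders."""
    mu_r = [bell(i_red, *p) for p in mf_red]       # layer 1: fuzzification
    mu_g = [bell(i_green, *p) for p in mf_green]
    mu_b = [bell(i_blue, *p) for p in mf_blue]
    rules = list(product(range(6), range(2), range(2)))
    w = np.array([min(mu_r[r], mu_g[g], mu_b[b]) for r, g, b in rules])  # layer 2
    w_bar = w / w.sum()                            # layer 3: normalization
    c = np.array([singletons[rule] for rule in rules])
    return float(np.dot(w_bar, c))                 # layers 4-5: POB = sum(w_bar_i * Ci)
```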


  Circuit Design and Implementation


Based on the system architecture presented in section II, we need to implement the blocks in a CMOS process on a single chip die.

CMOS image sensor implementation

The main challenge in realizing the proposed architecture is the implementation of the CIS. A CIS usually needs an expensive custom process, and integrating commercial signal-processing blocks with a custom CIS process is very hard. This problem was solved in a study by Aliparast and Koozehkanai,[18] which demonstrated a method to implement a current-mode active pixel sensor (APS) in the standard commercial 0.18 μm TSMC RF-CMOS process. [Figure 8] shows the process parameters of the standard 0.18 μm CMOS technology used for the avalanche photodiode (APD). One of the most important advantages of the presented APS is the integration of signal amplification and a voltage-to-current converter inside the collection area of the pixel, which increases the sensitivity of the device owing to the in-pixel amplification.
Figure 8: The process parameters of the standard 0.18 μm CMOS technology used for the avalanche photodiode[18]



In addition, in-pixel delta reset sampling (DRS) helps to implement the on-chip signal processing structures. [Figure 9] shows the circuit schematic of the APS with in-pixel DRS. Furthermore, the main specifications of the APS are summarized in [Table 1]. The photodiode (PD) size is 10 μm × 10 μm, while the pixel size is 21 μm × 23 μm. [Figure 10] shows the layout of the APS structure. The APS covers light intensities from 0.1 lux to 27,000 lux. The average power dissipation of each pixel, including the integration period, DRS timing, and hold period, is 19.8 μW. With this performance, the pixels can be used widely in smart CIS with image processing at the focal plane through low-power analog circuits. Owing to the current-mode structure, the output currents of the pixels are gathered together for application to the ANFIS.
Figure 9: Circuit schematic of the active pixel sensor with in-pixel delta reset sampling

Table 1: Main specifications of the active pixel sensor

Figure 10: Layout of the active pixel sensor structure



Adaptive neuro-fuzzy inference system implementation

We now describe the implementation of the ANFIS structure in a standard CMOS process, based on the proposed blood-based abnormality detection inference engine described in section III. The circuits for implementing the ANFIS in a 0.35 μm CMOS process were reported in a study by Aliparast et al.[19] Here, we use the same circuits with minor modifications and optimizations for implementation in the 0.18 μm TSMC RF-CMOS process. The first layer of the ANFIS is the fuzzifier node. [Figure 11] shows the fuzzifier circuit for generating the bell-shaped MFs. The reference current (fuzzy "one") is chosen as 5 μA as a trade-off between speed and power consumption. The SPICE simulation results for the proposed fuzzifier circuit under parameter variation are shown in [Figure 12]. The second layer in the ANFIS structure is the fuzzy AND operator. We used a min-max circuit to implement the second layer. [Figure 13] shows the proposed current-mode circuits for implementing min-max.
Figure 11: The proposed circuit for generating bell-shaped membership function

Figure 12: The SPICE simulation results for the proposed fuzzifier circuit under parameter variation

Figure 13: The proposed circuit for implementing min-max



[Figure 14] shows the sample simulation result for the proposed Min-Max circuit with two bell-shaped MFs.
Figure 14: The simulation results of the proposed circuit for implementing min-max



To implement the functionality of layers 3 and 4, we have used a multiplier/divider circuit based on a MOS-translinear architecture. [Figure 15] illustrates the main idea. First, the current $W_i$ is multiplied by the current of the related singleton $C_i$ in the g-mean block, and the square root of the product is produced; then the squarer/divider circuit calculates the square of this signal and divides it by $W_t$ ($W_t = \sum_i W_i$). The internal structures of the g-mean and squarer blocks are similar and based on the voltage-translinear loop. [Figure 16] shows the circuit of the g-mean block. The output of the g-mean circuit is $\sqrt{W_i C_i}$. The squarer/divider architecture is shown in [Figure 17]. The circuit squares its input and divides it by $W_t$, so at the output we have:
Figure 15: Block diagram of the proposed multiplier/divider circuit

Figure 16: The internal circuit of the g-mean block

Figure 17: The internal circuit of the squarer block

$$I_{out,i} = \frac{\left(\sqrt{W_i C_i}\right)^{2}}{W_t} = \frac{W_i C_i}{W_t}$$
Now, using KCL to sum the output currents, we finally obtain:

$$POB = \sum_{i=1}^{24} \frac{W_i C_i}{W_t} = \frac{\sum_{i=1}^{24} W_i C_i}{\sum_{i=1}^{24} W_i}$$
Hence, the current can be amplified as much as needed by a current mirror and sent to the input of the single-layer perceptron.
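As a quick numeric sanity check (under the same singleton-consequent assumption), chaining the g-mean and squarer/divider operations and summing the branch currents reproduces the normalized output of Eq. 6. The example currents below are arbitrary illustrative values.

```python
import numpy as np

w = np.array([0.1, 0.5, 2.0, 1.4])   # example rule firing-strength currents Wi
c = np.array([4.0, 1.0, 3.0, 2.0])   # example singleton currents Ci
wt = w.sum()                         # Wt = sum(Wi)

gmean = np.sqrt(w * c)               # g-mean stage: sqrt(Wi * Ci)
branch = gmean ** 2 / wt             # squarer/divider stage: Wi * Ci / Wt
pob_circuit = branch.sum()           # KCL sum of the branch currents
pob_eq6 = np.dot(w / wt, c)          # Eq. 6: sum(w_bar_i * Ci)

assert np.isclose(pob_circuit, pob_eq6)  # the two computations agree
```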

Analog to digital converter implementation

We have used a low-power 10-bit, 10 MS/s successive approximation register (SAR) ADC with monotonic switching to implement the ADC block. The ADC structure was presented in a study by Yousefirad et al.,[20] and we improved its performance with small modifications at the transistor level. [Figure 18] shows the structure of the ADC. In the ADC block, the monotonic capacitor-switching scheme employed in the core design lowers the switching energy and simplifies the switch structure. In addition, a double-supply scheme has been used to reduce the power consumption of the digital circuit section. Halving the digital power supply results in a considerable reduction in the power consumption of the digital parts: a factor-of-4 reduction follows from the quadratic relationship between supply voltage and power consumption, and a further factor-of-2 reduction is caused by the nonlinear voltage-dependent behavior of the device capacitances at the internal nodes of the digital circuits. Overall, the power management scheme results in a 10% improvement in total power consumption compared with a design that uses only a single supply. [Table 2] shows the main specifications of the proposed ADC.
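For reference, the quadratic dependence mentioned above is the usual dynamic-power relation of CMOS logic (a generic reminder, not chip-specific data):

$$P_{dyn} = \alpha\, C_L\, V_{DD}^{2}\, f \quad\Rightarrow\quad P_{dyn}\!\left(\tfrac{V_{DD}}{2}\right) = \tfrac{1}{4}\, P_{dyn}(V_{DD})$$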
Figure 18: The proposed analog to digital converter structure

Table 2: Main specifications of the proposed analog to digital converter




  Conclusion


We have described the architecture of a new custom smart CIS in this article. The proposed architecture is useful for low-power WCE. The new smart CIS includes a 256 × 256 pixel array with a new on-chip ANFIS used to diagnose bleeding images of the GI tract. Data transmission is reduced because the captured images are examined on-chip. [Figure 19] shows the chip layout of the proposed smart CIS without pads. All circuits and blocks were implemented in the commercial TSMC 0.18 μm CMOS 1P6M RF process. There was a 3% implementation error between the design and the postlayout simulation of the implemented circuits; hence, we decided to decrease the comparison level of the POB perceptron from 19% to 17% to avoid missing any bleeding pictures. As a result, the accuracy of the bleeding detection system decreases from 97% to 92%. Note that error in the proposed system means that nonbleeding pictures are detected as bleeding pictures and sent to the RF section. With this modification, we guarantee that all bleeding images are sent; the system never misses a bleeding picture and only sends some nonbleeding pictures by mistake.
Figure 19: Layout of the proposed smart CMOS image sensor



As illustrated in section II, the main advantage of the proposed smart CIS is its low power consumption. If we set the picture capture rate to 2 fps, the same as a commercial WCE (taking about 72,000 images in total over 10 h), the proposed smart CIS consumes only 0.61 mW. [Figure 20] shows the power consumption of the proposed smart CIS versus time in mode 1. Mode 1 is the normal working state of the smart CIS, in which the detection system looks for bleeding pictures. For more than 99% of the time (depending on the number of bleeding images and the area of the abnormalities), the smart CIS is in mode 1; if it finds a bleeding picture, it changes to the second mode to capture and send pictures. Note that the RF transmitter is off in mode 1, so its power consumption is near zero. As shown in [Figure 20], during the integration period the smart CIS consumes 1.25 W for 100 μs; then, for the next 45 μs, the image processing blocks work with 650 mW power consumption. The power dissipation decreases to 0.3 mW for the remainder of the period. Hence, the average power consumption in each period (500 ms) is 0.61 mW. [Table 3] shows a brief comparison between the power consumption of the proposed smart CIS and other works. A commercial WCE has a power consumption of about 10 mW. In a study by Itoh et al.,[21] a one-chip camera device with a low-power digital data transmission function for WCE was presented. The presented SoC has QVGA resolution with a 2 fps transmission rate, and the measured on-chip power consumptions are 950 μW, 250 μW, and 1400 μW for the analog, digital, and RF transmitter sections, respectively. In a study by Zhang et al.,[22] a low-power full-custom CMOS digital pixel sensor array for WCE with a 2 fps transmission rate was presented; the reported on-chip power consumption is 1.4 mW for the CIS and 2.2 mW for the JPEG-LS encoder.
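As a consistency check, the 0.61 mW average quoted above follows directly from the stated mode-1 power profile over one 500 ms period:

$$\bar{P} = \frac{1.25\,\text{W} \times 100\,\mu\text{s} + 650\,\text{mW} \times 45\,\mu\text{s} + 0.3\,\text{mW} \times 499.86\,\text{ms}}{500\,\text{ms}} \approx \frac{125\,\mu\text{J} + 29.3\,\mu\text{J} + 150\,\mu\text{J}}{0.5\,\text{s}} \approx 0.61\,\text{mW}$$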
Figure 20: Power consumption of the proposed smart CMOS image sensor in the mode 1

Table 3: A brief power consumption comparison of wireless capsule endoscopes



Financial support and sponsorship

None.

Conflicts of interest

There are no conflicts of interest.


  Biographies




Peiman Aliparast was born in Tabriz, Iran. He received the M.Sc. degree from Urmia University, Urmia, Iran, in 2007 and the Ph.D. degree from the University of Tabriz, Tabriz, Iran, in 2012, both in Electronics Engineering. From 2004 to 2008, he was with the Microelectronics Research Laboratory at Urmia University, and from 2008 to 2012, he was a research assistant in the Integrated Circuits Research Laboratory, University of Tabriz. He is currently the director of the Microsystems Laboratory and an assistant professor at the Aerospace Research Institute (Ministry of Science, Research and Technology), Tehran, Iran. His research interests are smart CMOS image sensors for biomedical applications, RFIC and MMIC for space telecommunication systems, analog and digital integrated circuit design for fuzzy and neural network applications, analog integrated filter design, and high-speed high-resolution digital-to-analog converters. He is a member of the Iran Microelectronics Association (IMA).

Email: [email protected]



 
  References

1. Eliakim R. Video capsule endoscopy of the small bowel. Curr Opin Gastroenterol 2008;24:159-63.
2. McCaffrey C, Chevalerias O, Mathuna C, Twomey K. Swallowable-capsule technology. IEEE Pervasive Comput 2008;7:23-9.
3. Available from: http://aigindia.net/capsuleendoscopy.html. [Last accessed on 2020 Feb 16].
4. Iddan G, Meron G, Glukhovsky A, Swain P. Wireless capsule endoscopy. Nature 2000;405:417.
5. Valdastri P, Webster R, Quaglia C, Quirini M, Menciassi A, Dario P. A new mechanism for mesoscale legged locomotion in compliant tubular environments. IEEE Trans Robot 2009;25:1-11.
6. Moglia A, Menciassi A, Dario P, Cuschieri A. Capsule endoscopy: Progress update and challenges ahead. Nat Rev Gastroenterol Hepatol 2009;6:353-62.
7. Nakamura T, Terano A. Capsule endoscopy: Past, present, and future. J Gastroenterol 2008;43:93-9.
8. Canlas KR, Dobozi BM, Lin S, Smith AD, Rockey DC, Muir AJ, et al. Using capsule endoscopy to identify GI tract lesions in cirrhotic patients with portal hypertension and chronic anemia. J Clin Gastroenterol 2008;42:844-8.
9. Liangpunsakul S, Chadalawada V, Rex DK, Maglinte D, Lappas J. Wireless capsule endoscopy detects small bowel ulcers in patients with normal results from state of the art enteroclysis. Am J Gastroenterol 2003;98:1295-8.
10. Jang JS. ANFIS: Adaptive-network-based fuzzy inference system. IEEE Trans Syst Man Cybern 1993;23:665-85.
11. Keuchel M, Hagenmüller F, Fleischer DE. Atlas of Video Capsule Endoscopy. 1st ed. Heidelberg: Springer; 2006.
12. Feldman M, Schiller LR, editors. Gastroenterology and Hepatology. Vol. 7. 1st ed. Current Medicine; 1997. p. 244.
13. Hara AK, Leighton JA, Sharma VK, Fleischer DE. Small bowel: Preliminary comparison of capsule endoscopy with barium study and CT. Radiology 2004;230:260-5.
14. Hara AK, Leighton JA, Sharma VK, Heigh RI, Fleischer DE. Imaging of small bowel disease: Comparison of capsule endoscopy, standard endoscopy, barium examination, and CT. Radiographics 2005;25:697-711.
15. Al-Rahayfeh AA, Abuzneid AA. Detection of bleeding in wireless capsule endoscopy images using range ratio color. Int J Multimed Appl 2010;2:1-10.
16. Karargyris A, Bourbakis N. A methodology for detecting blood-based abnormalities in wireless capsule endoscopy videos. In: Proceedings of the 8th IEEE International Conference on BioInformatics and BioEngineering (BIBE); 2008.
17. Kodogiannis VS, Boulougoura M, Lygouras JN, Petrounias I. A neuro-fuzzy-based system for detecting abnormal patterns in wireless-capsule endoscopic images. Neurocomputing 2007;70:704-17.
18. Aliparast P, Koozehkanai ZD. A current mode active pixel with high sensitivity pinned PD in standard CMOS process for smart image sensors. Microelectron J 2013;44:1208-14.
19. Aliparast P, Khoei A, Hadidi KH. A novel adaptive fully-differential gm-C filter, tuneable with a CMOS fuzzy logic controller for automatic channel equalization after digital transmissions. AEU Int J Electron Commun 2009;63:374-86.
20. Yousefirad S, Nasirzadeh Azizkandi N, Aliparast P. A 10-bit, 10 MS/s double supply low power analog to digital converter with monotonic switching for wireless. In: Proceedings of the 10th International Conference on Technical and Physical Problems of Electrical Engineering; 2014; Baku, Azerbaijan.
21. Itoh S, Kawahito S, Terakawa S. A 2.6 mW 2 fps QVGA CMOS one-chip wireless camera with digital image transmission function for capsule endoscopes. In: Proceedings of the IEEE International Symposium on Circuits and Systems; 2006.
22. Zhang M, Bermak A, Li X, Wang Z. A low power CMOS image sensor design for wireless endoscopy capsule. In: Proceedings of the IEEE Biomedical Circuits and Systems Conference; 2008.
