Citation statistics : Table of Contents
2011 | April-June | Volume 1 | Issue 1
Online since September 23, 2019

 
 
REVIEW ARTICLES
An overview of randomization and minimization programs for randomized clinical trials
Mahmoud Saghaei
April-June 2011, 1(1):55-61
DOI:10.4103/2228-7477.83520  PMID:22606659
Randomization is an essential component of sound clinical trials; it prevents selection bias and helps in blinding the allocations. Randomization is the process by which successive subjects are enrolled into trial groups purely by chance, which essentially eliminates selection bias. A serious potential consequence of randomization, however, is severe imbalance among the treatment groups with respect to some prognostic factors, which invalidates the trial results or necessitates complex and usually unreliable secondary analyses to remove the source of the imbalance. Minimization, on the other hand, tends to allocate subjects in such a way as to minimize the differences among groups with respect to prognostic factors. Pure minimization is therefore completely deterministic: one can predict the allocation of the next subject by knowing the factor levels of the previously enrolled subjects and the characteristics of the next subject. To eliminate this predictability, it is necessary to include some element of randomness in the minimization algorithm. In this article, brief descriptions of randomization and minimization are presented, followed by an introduction to selected randomization and minimization programs.
Cited: 59 | Viewed: 1,766 | PDF: 141
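As a rough illustration of the minimization procedure this review discusses, the Python sketch below implements a minimal biased-coin minimization for two groups; the function, parameter names, and probability value are our own illustrative choices, not taken from any of the programs the article covers.

```python
import random

def minimization_assign(new_subject, enrolled, groups=("A", "B"), p_best=0.8):
    """Assign the next subject to the group that minimizes covariate imbalance.

    new_subject : dict of prognostic factor -> level, e.g. {"sex": "F"}
    enrolled    : list of (group, factor_dict) tuples for prior subjects
    p_best      : probability of following the minimizing choice; the
                  remaining probability is the random element that keeps
                  the next allocation unpredictable
    """
    imbalance = {}
    for g in groups:
        total = 0
        for factor, level in new_subject.items():
            # Count prior subjects sharing this factor level, per group,
            # then pretend the new subject joins group g.
            counts = {h: 0 for h in groups}
            for group, factors in enrolled:
                if factors.get(factor) == level:
                    counts[group] += 1
            counts[g] += 1
            total += max(counts.values()) - min(counts.values())
        imbalance[g] = total

    best = min(imbalance, key=imbalance.get)
    if random.random() < p_best:
        return best
    return random.choice([g for g in groups if g != best])

# Example: two subjects already enrolled, assign the third.
history = [("A", {"sex": "F", "age": "<40"}),
           ("B", {"sex": "M", "age": "<40"})]
print(minimization_assign({"sex": "F", "age": ">=40"}, history))
```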
Review of fast Monte Carlo codes for dose calculation in radiation therapy treatment planning
Keyvan Jabbari
April-June 2011, 1(1):73-86
DOI:10.4103/2228-7477.83522  PMID:22606661
An important requirement in radiation therapy is a fast and accurate treatment planning system. Using computed tomography (CT) data together with the direction and characteristics of the beam, such a system calculates the dose at all points of the patient's volume. The two main factors in a treatment planning system are accuracy and speed, and successive generations of treatment planning systems have been developed with these two factors in mind. This article reviews fast Monte Carlo treatment planning algorithms, which aim to be accurate and fast at the same time. Monte Carlo techniques are based on the transport of each individual particle (e.g., photon or electron) through the tissue, using the physics of the interactions of particles with matter, whereas other techniques transport particles as a group. For a typical dose calculation in radiation therapy, the code has to transport several million particles, which takes a few hours; Monte Carlo techniques are therefore accurate but too slow for clinical use. In recent years, with the development of 'fast' Monte Carlo systems, dose calculation can be performed in a time acceptable for clinical use, which is on the order of one minute. There is currently growing interest in fast Monte Carlo treatment planning systems, and many commercial treatment planning systems now perform dose calculation in radiation therapy based on the Monte Carlo technique.
Cited: 31 | Viewed: 1,261 | PDF: 122
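To illustrate the particle-by-particle transport principle the abstract describes (not any specific fast Monte Carlo code), the sketch below samples photon interaction depths in a homogeneous water-like slab and tallies the deposited energy per depth bin; the attenuation coefficient, energy, and geometry are illustrative assumptions.

```python
import math
import random

def toy_photon_dose(n_photons=100_000, depth_cm=20.0, bins=40, mu=0.07,
                    energy_mev=6.0):
    """Toy 1D photon transport in a homogeneous slab: sample each photon's
    first-interaction depth from the attenuation coefficient mu (1/cm) and
    deposit its energy locally. Real codes follow secondary electrons and
    several interaction types; this only shows the particle-by-particle idea."""
    dose = [0.0] * bins
    bin_width = depth_cm / bins
    for _ in range(n_photons):
        depth = -math.log(1.0 - random.random()) / mu  # sampled free path
        if depth < depth_cm:
            dose[int(depth / bin_width)] += energy_mev
    return dose

dose = toy_photon_dose()
for i in range(0, 40, 8):  # coarse depth-dose printout (bin width 0.5 cm)
    print(f"{i * 0.5:5.1f} cm : {dose[i]:12.1f} MeV deposited")
```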
A review of coronary vessel segmentation algorithms
Maryam Taghizadeh Dehkordi, Saeed Sadri, Alimohamad Doosthoseini
April-June 2011, 1(1):49-54
DOI:10.4103/2228-7477.83519  PMID:22606658
Cited: 13 | Viewed: 943 | PDF: 78
A brief survey of computational models of normal and epileptic EEG signals: A guideline to model-based seizure prediction
Farzaneh Shayegh, Rasoul AmirFattahi, Saeid Sadri, Karim Ansari-Asl
April-June 2011, 1(1):62-72
DOI:10.4103/2228-7477.83521  PMID:22606660
In recent decades, seizure prediction has motivated a great deal of research in both signal processing and neuroscience. Researchers have tried to improve conventional seizure prediction algorithms so that the false-alarm rate becomes small enough for seizures to be predicted according to clinical standards. To date, none of the proposed algorithms has been sufficiently adequate. In this article we show that taking the mechanism of seizure generation into account may improve prediction results. For this purpose, an algorithm based on identifying the parameters of a physiological model of seizures is introduced. Some models of electroencephalographic (EEG) signals that can potentially be considered seizure models, together with several dedicated seizure models, are reviewed. As an example, the model of depth-EEG signals proposed by Wendling is studied and shown to be a suitable model.
Cited: 8 | Viewed: 1,017 | PDF: 62
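As a deliberately simplified illustration of model-based analysis (identifying model parameters from the signal and watching them change), not of the Wendling model itself, the sketch below fits an autoregressive model to two synthetic segments whose parameters differ; the AR model and all values are our own illustrative assumptions.

```python
import numpy as np

def ar_parameters(signal, order=2):
    """Least-squares fit of an AR(order) model
    x[n] = a1*x[n-1] + ... + a_order*x[n-order] + e[n].
    Tracking how such parameters drift over time is the generic idea
    behind model-based (parameter-identification) seizure prediction."""
    X = np.column_stack([signal[order - k - 1:len(signal) - k - 1]
                         for k in range(order)])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def simulate_ar2(a1, a2, n, rng):
    """Synthetic stand-in for an EEG segment: a stable AR(2) process."""
    x = np.zeros(n)
    for i in range(2, n):
        x[i] = a1 * x[i - 1] + a2 * x[i - 2] + rng.standard_normal()
    return x

rng = np.random.default_rng(0)
baseline = simulate_ar2(1.2, -0.4, 2000, rng)   # one dynamical regime
altered  = simulate_ar2(1.7, -0.8, 2000, rng)   # regime with shifted parameters
print("baseline parameters:", ar_parameters(baseline))
print("altered parameters :", ar_parameters(altered))
```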
ORIGINAL ARTICLES
CBMIR: Content-based image retrieval algorithm for medical image databases
Abdol Hamid Pilevar
April-June 2011, 1(1):12-18
DOI:10.4103/2228-7477.83460  PMID:22606654
We propose a novel algorithm for content-based retrieval of images from medical image databases. The aim of this article is to present a content-based retrieval algorithm that is robust to scaling and to translation of objects within an image. For efficient representation and retrieval of medical images, attention is focused on the methodology: the content of a medical image is represented by its regions and by the relationships between such objects or regions, described by the image attributes (IA) of the objects. CBMIR employs a new model in which each image is first decomposed into regions, and the similarity between images is measured by a scheme that integrates the properties of all regions in the images using regional matching. The method can answer queries by example. The efficiency and performance of the presented method have been evaluated on a dataset of about 5,000 simulated but realistic computed tomography and magnetic resonance images, whose originals were selected from three large medical image databases. The results of our experiments show a success rate of more than 93 percent, which is satisfactory.
Cited: 4 | Viewed: 835 | PDF: 35
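The region-based retrieval idea can be sketched roughly as follows; the grid partitioning, mean/standard-deviation features, and matching rule are simplifications of our own and do not reproduce the paper's CBMIR algorithm.

```python
import numpy as np

def region_features(image, grid=4):
    """Split an image into grid x grid regions and describe each region by
    its mean and standard deviation of intensity (a crude stand-in for the
    richer image attributes used in CBMIR)."""
    h, w = image.shape
    feats = []
    for i in range(grid):
        for j in range(grid):
            block = image[i * h // grid:(i + 1) * h // grid,
                          j * w // grid:(j + 1) * w // grid]
            feats.append((block.mean(), block.std()))
    return np.array(feats)

def regional_match_distance(query_feats, db_feats):
    """For every query region, find the closest region of the database
    image and accumulate that distance; smaller means more similar."""
    d = np.linalg.norm(query_feats[:, None, :] - db_feats[None, :, :], axis=2)
    return d.min(axis=1).sum()

# Usage: rank a small set of random 'scans' against a query by example.
rng = np.random.default_rng(1)
database = [rng.random((64, 64)) for _ in range(5)]
query = database[2] + 0.01 * rng.random((64, 64))   # a near-duplicate
scores = [regional_match_distance(region_features(query), region_features(img))
          for img in database]
print("best match:", int(np.argmin(scores)))
```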
A cellular automata-based model for simulating restitution property in a single heart cell
Seyed Hojjat Sabzpoushan, Fateme Pourhasanzade
April-June 2011, 1(1):19-23
DOI:10.4103/2228-7477.83517  PMID:22606655
Ventricular fibrillation is the cause of most sudden cardiac deaths. Restitution is one of the characteristic properties of the ventricular cell, and recent findings have clearly demonstrated a correlation between the slope of the restitution curve and ventricular fibrillation; modeling cellular restitution is therefore highly important. A cellular automaton is a powerful tool for simulating complex phenomena in a simple language: it is a lattice of cells in which the behavior of each cell is determined by the behavior of its neighboring cells together with the automaton rule. In this paper, a simple model is presented for simulating the restitution property of a single cardiac cell using cellular automata. First, two state variables, action potential and recovery, are introduced into the automaton model. Second, the automaton rule is determined, and the recovery variable is defined in such a way that restitution develops. To evaluate the proposed model, the restitution curve generated in our study is compared with restitution curves from experimental findings reported in established sources. Our findings indicate that the presented model is not only capable of simulating restitution in a cardiac cell, but can also regulate the restitution curve.
Cited: 3 | Viewed: 668 | PDF: 47
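A minimal single-cell automaton in the spirit of this abstract, with two state variables and a rule that ties action-potential duration to accumulated recovery, might look like the sketch below; all constants are illustrative, not fitted to experimental data.

```python
def run_cell(stim_times, n_steps=400, apd_min=8, rec_max=22):
    """Single-cell automaton sketch with two integer state variables:
    an action-potential counter `ap` and a recovery level `rec`.
    While resting, `rec` builds up (capped at rec_max); a stimulus starts
    an action potential of length apd_min + rec, after which rec resets."""
    ap, rec = 0, rec_max
    apds = []
    for t in range(n_steps):
        if ap > 0:                        # excited: count the AP down
            ap -= 1
        else:
            rec = min(rec + 1, rec_max)   # resting: recover
            if t in stim_times:
                ap = apd_min + rec        # automaton rule: APD follows recovery
                apds.append(ap)
                rec = 0
    return apds

# Restitution behaviour: faster pacing -> less recovery -> shorter APD.
for period in (20, 30, 40, 60):
    stims = set(range(10, 400, period))
    print(f"pacing every {period} steps -> APDs {run_cell(stims)[:4]}")
```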
Volumetric medical image coding: An object-based, lossy-to-lossless and fully scalable approach
Habibiollah Danyali, Alfred Mertins
April-June 2011, 1(1):1-11
DOI:10.4103/2228-7477.83504  PMID:22606653
In this article, an object-based, highly scalable, lossy-to-lossless 3D wavelet coding approach for volumetric medical image data (e.g., magnetic resonance (MR) and computed tomography (CT)) is proposed. The new method, called 3DOBHS-SPIHT, is based on the well-known set partitioning in hierarchical trees (SPIHT) algorithm and supports both quality and resolution scalability. The 3D input data is grouped into groups of slices (GOS), and each GOS is encoded and decoded as a separate unit. The symmetric tree definition of the original 3D SPIHT is improved by introducing a new asymmetric tree structure. While preserving the compression efficiency, the new tree structure allows each GOS to be kept small, which not only reduces memory consumption during encoding and decoding, but also facilitates more efficient random access to particular segments of slices. To achieve higher compression efficiency, the algorithm encodes only the main object of interest in each 3D data set, which can have any arbitrary shape, and ignores the unnecessary background. Experimental results on several MR data sets show the good performance of the 3DOBHS-SPIHT algorithm for multi-resolution lossy-to-lossless coding. The compression efficiency, full scalability, and object-based features of the proposed approach, besides its lossy-to-lossless coding support, make it a very attractive candidate for volumetric medical image archiving and transmission applications.
Cited: 2 | Viewed: 744 | PDF: 43
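To illustrate only the group-of-slices (GOS) idea, the sketch below (assuming the PyWavelets package is available) splits a volume into GOS units and wavelet-transforms each one independently; the SPIHT bit-plane coder and the arbitrary-shape object mask of 3DOBHS-SPIHT are not reproduced, and the wavelet and GOS size are illustrative choices.

```python
import numpy as np
import pywt  # PyWavelets, assumed to be installed

def transform_by_gos(volume, gos_size=16, wavelet="haar", level=3):
    """Split a volume into groups of slices (GOS) and apply a separate 3D
    wavelet decomposition to each group, so that every GOS can later be
    encoded and decoded as an independent unit."""
    groups = []
    for start in range(0, volume.shape[0], gos_size):
        gos = volume[start:start + gos_size]
        groups.append(pywt.wavedecn(gos, wavelet=wavelet, level=level))
    return groups

# Usage on a synthetic 64-slice volume: four independent GOS units, which is
# what makes random access to a segment of slices possible.
volume = np.random.default_rng(2).random((64, 128, 128)).astype(np.float32)
print(len(transform_by_gos(volume)), "independent groups of slices")
```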
Fuzzy logic controller for hemodialysis machine based on human body model
Vahid Reza Nafisi, Manouchehr Eghbal, Mohammad Reza Jahed Motlagh, Fatemeh Yavari
April-June 2011, 1(1):36-48
DOI:10.4103/2228-7477.83505  PMID:22606657
Fuzzy controllers are used in various control schemes. The aim of this study is to adjust the hemodialysis machine parameters by means of a fuzzy logic controller (FLC) so that the patient's hemodynamic condition remains stable during hemodialysis treatment. For this purpose, a comprehensive mathematical model of the arterial pressure response during hemodialysis, including hemodynamic, osmotic, and regulatory phenomena, has been used. The multi-input multi-output (MIMO) fuzzy logic controller receives three parameters from the model as inputs: heart rate, arterial blood pressure, and relative blood volume. According to the changes in the controller input values and its rule base, the outputs change so that the patient's hemodynamic condition remains stable. The simulation results illustrate that applying the controller can improve the stability of the patient's hemodynamic condition during hemodialysis treatment and also decrease the treatment time. Furthermore, by using fuzzy logic, there is no need for prior knowledge of the system under control, and the FLC is compatible with different patients.
Cited: 1 | Viewed: 765 | PDF: 45
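A toy fuzzy controller in the spirit of this abstract might be assembled as below; the inputs, membership functions, rule base, and single output are illustrative choices of ours and are not the MIMO rule base of the cited study.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_uf_adjustment(map_mmhg, rbv_percent):
    """Toy two-input, one-output fuzzy controller: from mean arterial
    pressure and relative blood volume it suggests a change in the
    ultrafiltration rate (arbitrary units)."""
    # Fuzzify the inputs.
    bp_low    = tri(map_mmhg, 50, 65, 80)
    bp_normal = tri(map_mmhg, 70, 90, 110)
    rbv_low   = tri(rbv_percent, 80, 85, 92)
    rbv_ok    = tri(rbv_percent, 88, 95, 102)

    # Rules (min for AND), each paired with an output singleton.
    rules = [
        (min(bp_low, rbv_low),     -1.0),   # hypotension risk -> reduce UF
        (min(bp_low, rbv_ok),      -0.5),
        (min(bp_normal, rbv_low),  -0.3),
        (min(bp_normal, rbv_ok),    0.0),   # stable -> keep UF rate
    ]
    # Weighted-average defuzzification.
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

print(fuzzy_uf_adjustment(map_mmhg=68, rbv_percent=86))  # negative: back off UF
```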
A novel method for trajectory planning of cooperative mobile manipulators
Hossein Bolandi, Amir Farhad Ehyaei
April-June 2011, 1(1):24-35
DOI:10.4103/2228-7477.83518  PMID:22606656
We present a two-stage scheme for the trajectory planning problem of two mobile manipulators cooperatively transporting a rigid body in the presence of static obstacles. In the first stage, taking the static obstacles into account, we develop a method that searches the workspace for the shortest possible path between the start and goal configurations by constructing a graph on a portion of the configuration space that satisfies the collision and closure constraints. In the second stage, a sequence of time-optimal trajectories between consecutive points of the path is computed, subject to the nonholonomic constraints and the maximum allowed joint accelerations. This approach allows geometric constraints, such as joint limits and closed-chain constraints, as well as differential constraints, such as nonholonomic velocity constraints and acceleration limits, to be incorporated into the planning scheme. The simulation results illustrate the effectiveness of the proposed method.
Cited: - | Viewed: 729 | PDF: 52
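The two-stage structure described here, graph search over feasible configurations followed by time-optimal segments, can be sketched as follows; the roadmap, edge costs, and velocity/acceleration limits are illustrative.

```python
import heapq
import math

def shortest_path(edges, start, goal):
    """Stage 1 (sketch): Dijkstra search over a roadmap whose nodes are
    configurations already filtered for collision and closed-chain
    constraints; `edges` maps node -> list of (neighbor, cost)."""
    dist, prev, heap = {start: 0.0}, {}, [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, math.inf):
            continue
        for v, w in edges.get(u, []):
            if d + w < dist.get(v, math.inf):
                dist[v], prev[v] = d + w, u
                heapq.heappush(heap, (d + w, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

def segment_time(distance, v_max, a_max):
    """Stage 2 (sketch): minimum traversal time for one segment under
    velocity and acceleration limits (triangular or trapezoidal profile)."""
    if distance <= v_max ** 2 / a_max:            # never reaches v_max
        return 2.0 * math.sqrt(distance / a_max)
    return distance / v_max + v_max / a_max       # trapezoidal profile

# Tiny roadmap with edge lengths; node labels stand for feasible configurations.
edges = {"start": [("A", 1.0), ("B", 2.5)], "A": [("goal", 1.5)],
         "B": [("goal", 1.5)]}
path = shortest_path(edges, "start", "goal")
cost = lambda u, v: dict(edges[u])[v]
total = sum(segment_time(cost(u, v), v_max=0.5, a_max=1.0)
            for u, v in zip(path, path[1:]))
print(path, f"total time {total:.2f} s")
```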