Relevant Projects
Algorithmic Development of Volume Segmentation and Analysis of Tumors From Radiological Images
This was a Master's research project. [Spring 2014]
In this project, we developed a new method for volume segmentation, analyzed the tumor volume, and built a
GUI for the tool. Segmentation is based on an active contour model: our algorithm first finds the coarse
boundaries of the tumor with a segmentation algorithm, and then refines those boundaries with a
parameterized snake algorithm. We also computed the histogram of the region before and after boundary
detection, the spatial histogram of the region, and the time taken by our algorithm to find the ROI.
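The boundary-refinement step can be illustrated with a minimal greedy snake: each contour point moves to the neighboring pixel that minimizes an external image energy plus an internal smoothness term. This is only a sketch of the general active contour technique, not the algorithm from the thesis; the energy terms and parameters below are illustrative assumptions.

```python
import numpy as np

def greedy_snake(ext_energy, pts, alpha=1.0, iters=100):
    """Greedy active-contour sketch.

    ext_energy: 2-D array, lower values attract the contour
                (e.g. negative gradient magnitude of the image).
    pts:        (N, 2) integer array of (row, col) contour points.
    alpha:      weight of the internal (smoothness) energy.
    """
    pts = pts.copy()
    n = len(pts)
    h, w = ext_energy.shape
    moves = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)]

    def local_energy(i, p):
        # external energy at p plus squared distance to both contour neighbors
        prev_p, next_p = pts[(i - 1) % n], pts[(i + 1) % n]
        internal = np.sum((p - prev_p) ** 2) + np.sum((p - next_p) ** 2)
        return ext_energy[p[0], p[1]] + alpha * internal

    for _ in range(iters):
        moved = False
        for i in range(n):
            cands = [pts[i] + m for m in moves
                     if 0 <= pts[i][0] + m[0] < h and 0 <= pts[i][1] + m[1] < w]
            best = min(cands, key=lambda p: local_energy(i, p))
            if (best != pts[i]).any():
                pts[i] = best
                moved = True
        if not moved:  # converged: no point wants to move
            break
    return pts
```

In practice the external energy would be derived from the image, for example the negative gradient magnitude, so that the contour is attracted to edges such as a tumor boundary.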
This project was carried out at Johns Hopkins School of Medicine
under the guidance of Dr. Michael Jacobs and
Dr. Vladimir Braverman.
A detailed report on this project can be found here.
Implementing the Haar wavelet transform for large datasets
This was a course project for EN.600.423 Data Intensive Computing. [Fall 2013]
Project members: Vaibhav Mohan, Ethan Holly, Matt Small, and Dan Swann
We implemented a function that converts a 2-D matrix into a wavelet-encoded matrix, a thresholding
function that discards values below a given threshold, and a function that decodes a wavelet-encoded
matrix back into the full matrix. We tested this implementation on 2 GB, 12 GB, and 56 GB datasets.
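A single level of the 2-D Haar transform, together with the thresholding and decoding steps, can be sketched in NumPy. The actual project targeted out-of-core datasets far larger than memory; this in-memory version only illustrates the encode/threshold/decode idea and assumes even matrix dimensions.

```python
import numpy as np

def haar2d(m):
    """One level of the 2-D Haar transform: rows, then columns."""
    def step(a):  # pairwise averages, then differences, along the last axis
        avg = (a[..., ::2] + a[..., 1::2]) / 2
        diff = (a[..., ::2] - a[..., 1::2]) / 2
        return np.concatenate([avg, diff], axis=-1)
    return step(step(m).T).T

def threshold(w, t):
    """Discard (zero out) coefficients with magnitude below t."""
    w = w.copy()
    w[np.abs(w) < t] = 0
    return w

def ihaar2d(w):
    """Invert haar2d: undo the column pass, then the row pass."""
    def istep(a):
        half = a.shape[-1] // 2
        avg, diff = a[..., :half], a[..., half:]
        out = np.empty_like(a)
        out[..., ::2] = avg + diff
        out[..., 1::2] = avg - diff
        return out
    return istep(istep(w.T).T)
```

Thresholding small coefficients before decoding is what makes the encoding useful for lossy compression: most of the signal energy concentrates in the average (top-left) quadrant.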
The code we wrote for this project, along with all other scripts, can be found
here. For more details on how we implemented this project and designed the experiment, please refer to my
wiki.
Here is our presentation on this project.
Benchmarking a NoSQL cluster - Cassandra
This was a course project for EN.600.423 Data Intensive Computing. [Fall 2013]
Project members: Vaibhav Mohan, Nick Carey, Ethan Holly, and Shaojie Chen
We configured and deployed Cassandra clusters of 1, 2, 4, and 8 nodes on Amazon EC2 M1
Large instances. We measured throughput and operations per second as a function of the size
of the read or write; to do this, we varied the size and number of read/write operations on
a single-node cluster. We also measured the scalability of clusters by varying the number of
clients accessing the cluster and the number of threads accessing the cluster from one
client. We then varied the number of nodes in the cluster for each setting and plotted the
results. The graphs can be found here. The code we wrote for this project, along with all
other scripts, can be found here. Here is our presentation on this project. For more details
on how we implemented this project and designed the experiment, please refer to my
wiki.
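The core measurement in such a benchmark reduces to timing a batch of operations against wall-clock time. A minimal sketch of that harness, with `op` standing in for an actual Cassandra read or write issued through a client library (an assumption here; the real experiment used the cluster's client API and varied payload sizes):

```python
import time

def ops_per_second(op, n_ops):
    """Run op() n_ops times and return the achieved operation rate."""
    start = time.perf_counter()
    for _ in range(n_ops):
        op()   # e.g. a single read or write against the cluster
    elapsed = time.perf_counter() - start
    return n_ops / elapsed
```

Sweeping this over payload sizes, client counts, and cluster sizes yields the throughput curves described above.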
Approximating the number of energy wells in a given protein molecule
This was a summer research project. [Summer 2013]
In this project, I used a CHARMM script to compute the energy of a protein molecule for a given
conformation of angles. Python's gearman module was used to parallelize the computation of energy
values, and based on the energies of the sampled conformations, I approximated the number of energy
wells present in the given protein molecule.
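The well-counting idea can be illustrated in one dimension: sweep a conformation angle, evaluate the energy at each sampled conformation, and count the local minima. This is a deliberately crude sketch of the approximation idea; the actual computation used CHARMM energies over multi-dimensional conformations, parallelized with gearman workers.

```python
def count_wells(energies):
    """Count strict local minima in a 1-D sweep of energy values."""
    return sum(
        1 for i in range(1, len(energies) - 1)
        if energies[i] < energies[i - 1] and energies[i] < energies[i + 1]
    )
```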
This project was carried out at Johns Hopkins University
under the guidance of Prof. Yanif Ahmad.
Comparison of various algorithms for building decision trees
This was a course project for EN.600.435 Artificial Intelligence. [Spring 2013]
In this project, decision trees were implemented using the traditional ID3 algorithm as well as an
evolutionary algorithm for learning decision trees. The traditional algorithm was implemented using both
information gain and gain ratio as splitting criteria, and each variant was modified to combat
over-fitting using pruning. The evolutionary algorithm was implemented with fitness-proportionate and
rank-based selection strategies, and with complete replacement and elitism as replacement strategies.
The two algorithms were compared on accuracy, precision, and recall by varying the aforementioned
parameters on datasets taken from the UCI Machine Learning Repository. The time taken by each algorithm
to learn the decision tree under each setting was also compared.
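The two splitting criteria used by the traditional algorithm can be sketched directly from their definitions. Representing examples as attribute dictionaries is an illustrative choice here, not the project's actual data structure.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Entropy reduction from splitting rows (attribute dicts) on attr."""
    n = len(labels)
    parts = {}
    for row, y in zip(rows, labels):
        parts.setdefault(row[attr], []).append(y)
    remainder = sum(len(p) / n * entropy(p) for p in parts.values())
    return entropy(labels) - remainder

def gain_ratio(rows, labels, attr):
    """Information gain normalized by the split information of attr."""
    split_info = entropy([row[attr] for row in rows])
    return info_gain(rows, labels, attr) / split_info if split_info else 0.0
```

Gain ratio penalizes attributes with many distinct values, which plain information gain tends to favor.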
This project was carried out at Johns Hopkins University under the guidance of Prof. Benjamin Mitchell.
A detailed paper on this project can be found here.
A comparison of various reinforcement learning algorithms to solve the racetrack problem
This was a course project for EN.600.435 Artificial Intelligence. [Spring 2013]
The racetrack problem is a standard control problem and can be solved using various reinforcement
learning techniques. In this project, the value iteration algorithm, the Q-learning algorithm, and a
variant of Q-learning that uses a function approximator were implemented to solve the racetrack
problem. The algorithms were compared on the scores they obtained on various maps while varying
parameters such as the convergence tolerance and the number of iterations. This project was carried
out at Johns Hopkins University under the guidance of Prof. Benjamin Mitchell.
A detailed paper on this project can be found here.
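The tabular Q-learning update at the heart of the comparison is compact. The sketch below learns a toy 1-D track rather than the full racetrack; the environment, the reward of -1 per step, and the hyperparameters are illustrative assumptions.

```python
import random
from collections import defaultdict

def q_learning(step, actions, start, episodes=200,
               alpha=0.5, gamma=0.9, eps=0.1):
    """step(s, a) -> (next_state, reward, done); returns the learned Q-table."""
    Q = defaultdict(float)
    for _ in range(episodes):
        s, done = start, False
        while not done:
            if random.random() < eps:                      # epsilon-greedy exploration
                a = random.choice(actions)
            else:
                a = max(actions, key=lambda act: Q[(s, act)])
            s2, r, done = step(s, a)
            best_next = max(Q[(s2, b)] for b in actions)   # greedy bootstrap target
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s2
    return Q
```

The `step` callable abstracts the environment, so the same loop could drive a racetrack simulator that returns the next state, a reward, and a done flag.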
Three's Company
This was a course project for EN.600.420 Parallel Programming. [Spring 2013]
This project identified triples of users that are mutual friends in a social network, i.e.,
A is friends with B, B with C, and C with A. The output enumerated the mutual friends for each user
and avoided duplicate entries. This project was carried out at Johns Hopkins University under the
guidance of Prof. Randal Burns. It was implemented using Apache Hadoop.
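Stripped of the MapReduce machinery, the mutual-friend logic is a set intersection per edge. A plain-Python sketch of the same computation (the Hadoop version partitions this work across mappers and reducers):

```python
def friend_triangles(friends):
    """friends: dict mapping user -> set of friends (undirected, no self-loops).
    Returns the set of unordered triples that are pairwise friends."""
    triples = set()
    for a in friends:
        for b in friends[a]:
            # any c that is friends with both a and b closes a triangle
            for c in friends[a] & friends[b]:
                triples.add(tuple(sorted((a, b, c))))
    return triples
```

Sorting each triple before adding it to the set is what deduplicates the six orderings in which the same triangle is discovered.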
Cellular Automata in MPI - Implementing Conway's Game of Life using MPI
This was a course project for EN.600.420 Parallel Programming. [Spring 2013]
This project simulated Conway's Game of Life on a 16x16 grid and was implemented using MPI.
This project was carried out at Johns Hopkins University under the guidance of Prof. Randal Burns.
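A serial reference for one generation of the simulation (the MPI version decomposes the grid across processes and exchanges boundary rows; the toroidal wrap-around here is an assumption):

```python
def life_step(grid):
    """One generation of Conway's Game of Life on a square toroidal grid."""
    n = len(grid)
    def live_neighbors(r, c):
        return sum(grid[(r + dr) % n][(c + dc) % n]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))
    # birth on exactly 3 neighbors; survival on 2 or 3
    return [[1 if live_neighbors(r, c) == 3
             or (grid[r][c] and live_neighbors(r, c) == 2) else 0
             for c in range(n)]
            for r in range(n)]
```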
Comparison of various classification models for making financial decisions
This was a course project for EN.600.475 Machine Learning. [Fall 2012]
In this project, machine learning classifiers such as the Winnow predictor, the perceptron predictor,
and a neural network predictor were implemented to predict whether somebody will experience financial
distress in the next two years, based on features such as monthly income, age, and the number of open
credit lines and loans. The accuracies of the implemented models were also compared. This project was
carried out at Johns Hopkins University under the guidance of Prof. Mark Dredze.
A detailed report of this project can be found here.
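Of the implemented predictors, the perceptron has the simplest update rule: on each mistake, move the weights toward the misclassified example. A minimal sketch on generic numeric features (the toy data and parameters are illustrative, not the competition's actual attributes):

```python
def perceptron_train(data, epochs=25, lr=1.0):
    """data: list of (features, label) pairs with label in {-1, +1}."""
    dim = len(data[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in data:
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:                       # mistake: nudge toward x
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
```

On linearly separable data the perceptron convergence theorem guarantees a finite number of mistakes; Winnow follows the same mistake-driven loop but with multiplicative weight updates.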
Clustering Algorithm for large datasets on GPU
This was a course project for EN.600.615 Big Data, Small Languages, Scalable Systems. [Fall 2012]
In this project, we had large datasets of protein structures, one per timestamp. The k-means clustering
algorithm was implemented on the GPU and used to cluster the data according to their different energy
levels. NVIDIA's massively parallel platform, CUDA, was used to implement the algorithm on the GPU.
This project was carried out at Johns Hopkins University under the guidance of Prof. Yanif Ahmad.
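A serial NumPy reference of the same algorithm; a GPU version typically parallelizes the assignment step, computing one point's nearest center per thread:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Lloyd's algorithm: alternate nearest-center assignment and mean update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assignment step: label each point with its nearest center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # update step: move each center to the mean of its assigned points
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels, centers
```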
Programming model for GPGPU for various algorithmic problems
This was a final-year undergraduate project. [Spring 2012]
In this project, the RSA algorithm and dense matrix-matrix multiplication were implemented on both the
CPU and the GPU. The GPU implementations were written in CUDA C and the CPU implementations in C++.
The execution times of both algorithms on both platforms were also compared. This project was carried
out at VIT University under the guidance of Prof. Balaraman S.
A detailed report of this project can be found here.
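The RSA half of the project reduces to modular exponentiation. A textbook-parameter sketch in Python (the small primes here are illustrative only; the project's CUDA C and C++ code performed the same arithmetic):

```python
def make_rsa(p=61, q=53, e=17):
    """Tiny textbook RSA keypair from two primes and a public exponent."""
    n = p * q                     # public modulus
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)           # private exponent via modular inverse (Python 3.8+)
    encrypt = lambda m: pow(m, e, n)
    decrypt = lambda c: pow(c, d, n)
    return encrypt, decrypt
```

The three-argument `pow` performs fast modular exponentiation, which is exactly the kernel a GPU implementation would accelerate over many messages in parallel.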
Data regression with Normal Equation on GPU using CUDA
This was a course project for Soft Computing. [Fall 2011]
Project Partner: Mayank Gupta
In this project, we implemented the normal equation algorithm on the Graphics Processing Unit (GPU)
using CUDA as well as on the CPU. The performance of the two implementations was compared, and the GPU
implementation was found to be considerably faster than the CPU one. This project resulted in a
publication and was carried out at VIT University.
The publication resulting from this project can be found here.
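Given a linear-algebra routine, the normal equation itself is a one-liner; a GPU implementation accelerates the underlying matrix products. A NumPy sketch, using `solve` rather than an explicit inverse for numerical stability:

```python
import numpy as np

def normal_equation(X, y):
    """Least-squares fit: solve (X^T X) theta = X^T y for theta."""
    return np.linalg.solve(X.T @ X, X.T @ y)
```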
Singer’s Information Website
This was a course project for Web Technologies. [Spring 2010]
Project Partner: Vishwas Singh Chauhan
The Singer’s Information website was built using HTML, with JavaScript for client-side scripting, JSP
for server-side scripting, and Oracle 10g as the database. The website contained information about
different singers, along with suggestions for each singer's top songs. Users could partially stream
songs and videos related to a particular singer and buy them after creating an account on the website.
This project was carried out at VIT University under the guidance of
Prof. Vanmathi S.
Online Bank System Website
This was a course project for Web Technologies. [Spring 2010]
Project Partner: Vishwas Singh Chauhan
The Online Bank System website was built using HTML, with JavaScript for client-side scripting, PHP
for server-side scripting, and MySQL as the database. The website offered basic functionality such as
opening an account, depositing money into an account, and withdrawing money from an account. It had
different user levels: clients (those who opened accounts) and administrators (who had full access to
every database and could create or delete accounts and view information about every client). This
project was carried out at VIT University under the guidance of
Prof. Vanmathi S.
Scientific Calculator Application for Android OS
This was a mini project. [Fall 2010]
In this project, a scientific calculator application for Android OS was implemented using Android SDK
v1.5. Eclipse was used as the IDE, and the code was written in Java. The application extended basic
calculator operations with functions such as sines, cosines, and logarithms. This project was carried
out at VIT University under the guidance of
Prof. Kumaresan P.
Environmental Impact Assessment of Athirapally, Kerala, India
This was a course project for Atmospheric Processes and Climate Change. [Fall 2011]
Project Partners: Vishal Narayan and Kasthuri Prakash
In this project, we designed a vulnerability assessment index for one of the country’s most endangered
species reservoirs, the first index of its kind in India. We analyzed about 50 parameters related to
Athirapally, India, and, based on the analysis of all the parameters, calculated the Environmental
Vulnerability Index of the area. This project was carried out at VIT University.
The analysis of the parameters can be found here.