I am a senior undergraduate at the University of Michigan majoring in Computer Science, graduating in April 2018.
- Depth in the Wild : A CNN model that achieves single-image depth perception using relative-depth annotations
- MapReduce Server : A multi-process, multi-threaded server that executes user-submitted MapReduce jobs
- Sentiment Classification : Using deep learning to classify facial expressions
- wiki-search : A full-stack web search engine for wikis
Depth in the Wild
Depth from a single RGB image is a fundamental problem in vision. Recent years have seen rapid progress thanks to data-driven methods, in particular deep neural networks trained on large RGB-D datasets. But such advances have yet to broadly impact higher-level tasks. One reason is that many higher-level tasks must operate on images "in the wild", images taken with no constraints on cameras, locations, scenes, or objects, whereas the RGB-D datasets used to train and evaluate image-to-depth systems are constrained in one way or another.

Our model makes it easier to build future depth-perception datasets, since relative-depth annotations can be crowd-sourced instead of captured with RGB-D cameras. I implemented the "hourglass" neural network in PyTorch and trained it on hundreds of thousands of data points on the FLUX server cluster. I tested our model against the previous state of the art: our algorithm performs better on the ordinal relation metric, with a weighted disagreement rate of 28.3%.
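To give a flavor of how training on relative-depth annotations works, here is a minimal PyTorch sketch of a pairwise ranking loss over annotated point pairs. The function name, tensor layout, and the exact label encoding (+1 if the first point is farther, -1 if closer, 0 if roughly equal) are my illustrative assumptions, not our exact implementation.

```python
import torch

def relative_depth_loss(depth, i_idx, j_idx, ordinal):
    """Ranking loss over crowd-sourced point pairs (illustrative sketch).

    depth:        (N,) predicted depths at sampled pixels
    i_idx, j_idx: (P,) indices of the two points in each annotated pair
    ordinal:      (P,) +1 if point i is farther, -1 if closer, 0 if equal
    """
    diff = depth[i_idx] - depth[j_idx]
    unequal = ordinal != 0
    # Unequal pairs: logistic loss that pushes the predicted ordering
    # to agree with the human ordinal label.
    loss_unequal = torch.log1p(torch.exp(-ordinal[unequal].float() * diff[unequal]))
    # "Roughly equal" pairs: penalize any predicted depth difference.
    loss_equal = diff[~unequal] ** 2
    return torch.cat([loss_unequal, loss_equal]).mean()
```

A prediction that orders a pair correctly and by a wide margin incurs a near-zero loss; flipping the label on the same prediction makes the loss large, which is the gradient signal that trains the network from ordinal annotations alone.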
MapReduce Server
A single-machine, multi-process, multi-threaded server that executes user-submitted MapReduce jobs. It runs each job to completion, handling failures along the way, and writes the job's output to a given directory, so you can run any MapReduce job on your machine with this server application. We built the server to support arbitrary map-reduce programs with automatic parallelization and distribution. I implemented the 'master' server, which splits a computationally heavy task into parts and distributes them to several workers to complete. We used TCP and UDP to make the system robust and fault-tolerant, and tested it on various kinds of map-reduce jobs.
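The map-shuffle-reduce flow the server orchestrates can be sketched in a few lines of single-process Python. This word-count example only illustrates the data flow; the function names are mine, and the real server runs the phases in separate worker processes communicating over TCP/UDP.

```python
from collections import defaultdict
from itertools import chain

def map_words(chunk):
    # Map phase: emit a (word, 1) pair for every word in one input chunk.
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    # Shuffle phase: group all emitted values by key, mirroring the
    # grouping step workers perform between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_counts(key, values):
    # Reduce phase: combine all counts for one word.
    return key, sum(values)

def run_job(chunks):
    # The master splits the input into chunks (pre-split here),
    # farms each chunk to a mapper, then reduces every key group.
    mapped = chain.from_iterable(map_words(c) for c in chunks)
    return dict(reduce_counts(k, v) for k, v in shuffle(mapped).items())
```

For example, `run_job(["a b a", "b c"])` returns `{"a": 2, "b": 2, "c": 1}`. In the real system each chunk would be a file partition handed to a worker process, and the master would re-assign a partition if its worker failed.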
Sentiment Classification
Sentiment can be expressed in many ways. In this project, we classify sentiment conveyed through images of facial expressions, in three stages of increasing sophistication. Each method begins by selecting features, then learns a mapping from the feature space to the label space. The input data, from which we extract features, consists of images of faces; each label is one of seven sentiments: Anger, Disgust, Fear, Happiness, Sadness, Surprise, and Neutral. We used a modern deep learning framework, TensorFlow, to build and train a convolutional neural network. I implemented an architecture consisting of multiple convolutional, activation, pooling, and fully connected layers. We tested the model's accuracy: it outperforms k-means clustering and an auto-encoder trained on the same dataset.
wiki-search
A scalable search engine that resembles commercial systems: it indexes scraped web data and implements both the front end and the back end of the search engine. I implemented indexing with MapReduce (using Hadoop) so it scales to large corpora. I built the information retrieval pipeline on both term frequency-inverse document frequency (tf-idf) and PageRank scores, and made a front-end search interface with two special features: user-driven scoring and links to similar documents.
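To show how the two signals combine, here is a simplified Python sketch that blends a tf-idf relevance score with a document's PageRank under a user-chosen weight. The scoring formula is deliberately reduced (no cosine normalization), and all function names are hypothetical.

```python
import math

def tf_idf_score(query_terms, doc_terms, doc_freq, num_docs):
    # Simplified relevance: sum over query terms of
    # term frequency in the document times inverse document frequency.
    score = 0.0
    for term in query_terms:
        tf = doc_terms.count(term)
        if tf and doc_freq.get(term):
            score += tf * math.log(num_docs / doc_freq[term])
    return score

def combined_score(query_terms, doc_terms, doc_freq, num_docs, pagerank, weight):
    # Final rank mixes query relevance (tf-idf) with link-based
    # importance (PageRank); `weight` is the user-driven slider.
    return weight * pagerank + (1 - weight) * tf_idf_score(
        query_terms, doc_terms, doc_freq, num_docs)
```

Setting `weight` near 1 ranks purely by link importance, while `weight` near 0 ranks purely by query relevance, which is the idea behind the user-driven scoring feature.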