Project Proposals

Send me an email after all the presentations have been made in class.
Send the email with the subject line "COMP 523 PROJECT PICKS".
The email is due Wednesday afternoon, 1/14/2015, by 5pm.

In this email give me the LETTERS (from the list below) of your top 6 project choices...
something like this: 1Q 2L 3H 4P 5C 6A

I will do what I can to maximize happiness but I give no guarantees other than you will end up assigned to some project. Think of it this way... every project here is awesome! So you will be happy no matter which you work on.

You may self-select a team if you wish... simply send me an email with all student names as a team along with your 6 project preferences. If you do not self-select a team, I will assign you individually to a team based on project preference matches as best I can. You may self-select a partial team as well.


Insect Data Visualization
Allen Hurlbert, UNC Biology

I am a faculty member in the Biology Department conducting research 
on global ecology. We live in an era of dramatic environmental 
change, with habitat modification and climate change going on at 
local, regional, and global scales. 

Crowdsourcing information about the natural environment (leading 
to visualizations like this one from eBird) has become an 
increasingly powerful way to understand how ecological systems 
are changing at large spatial scales. We are launching a new 
citizen science program called Caterpillars Count! which will help 
document how the timing of emergence of caterpillars and other 
insects varies across North America and with climate change. 
Changes experienced by insects will have large consequences for 
virtually all other members of the ecosystem. 

Last semester COMP 523 students developed mobile apps (both iOS and 
Android) for simple data entry in the field, and a basic backend database 
for storing those observations. I am now seeking students who would 
be interested in developing online data visualization tools so that 
scientists, educators and students can query these data and evaluate 
scientific questions related to the distribution, abundance, and 
seasonal timing of these insects.  

A simple example is in the interactive map here.


Big Data Analysis in Cheminformatics
Olexandr Isayev, UNC School of Pharmacy

I work at the School of Pharmacy in the field of cheminformatics. 
We have several very appealing problems for software engineering 
projects.  Those are real-world scientific problems with good potential 
for peer-reviewed publication if successful.  

- GPU-accelerated similarity calculations and processing "BigData" 
  for chemistry.
- Locality-sensitive hashing (LSH) for search in extremely large 
  chemical data sets.
- Improving the algorithm/performance of the kNN machine learning 
  algorithm.
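To illustrate the LSH bullet above: MinHash is the classic LSH family for set similarity, and chemical fingerprints can be treated as sets of feature indices. The sketch below is only illustrative; the representation, names, and parameters are assumptions, not part of the project spec.

```python
import random

def minhash_signature(features, hash_seeds):
    """MinHash signature: for each seeded hash function, keep the
    minimum hash value over the molecule's feature set."""
    return tuple(min(hash((seed, f)) for f in features) for seed in hash_seeds)

def lsh_buckets(molecules, num_hashes=12, bands=4):
    """Band the signatures so that molecules sharing any one band
    collide in a bucket; similar sets collide with high probability,
    so only bucket-mates need exact similarity computed."""
    random.seed(0)
    seeds = [random.randrange(2**32) for _ in range(num_hashes)]
    rows = num_hashes // bands
    buckets = {}
    for name, feats in molecules.items():
        sig = minhash_signature(feats, seeds)
        for b in range(bands):
            key = (b, sig[b * rows:(b + 1) * rows])
            buckets.setdefault(key, set()).add(name)
    return buckets
```

Candidate pairs are then read off the buckets, and only those pairs get an exact Tanimoto/Jaccard check.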

Document with details.


Blip for Android
Michael Zulauf, Blip Inc.

We are set to launch a person to person location sharing application
for the iPhone within the next few weeks, but we are also actively
searching for a team to help us build a version of the app for
Android. Blip allows users to share and receive real-time locations
for a set period of time in a safe and fun way. We have all of the
designs and will have a working iOS model within 3 weeks. Our goal is
to acquire a large iPhone user base in Chapel Hill over the next 8 months
before seeking further funding to expand and ultimately introduce an
Android version as well.

Going out as a college student is fun. But location sharing and other 
recent innovations have created the potential to bring that fun 
to a whole new level. Blip is a social app aimed at combining new 
technology and design in a simple way to make going out more fun for
college folks.

Blip was founded in 2014 by Ricky McMahon and Michael Zulauf. 
Ricky is a '12 UNC alum and current graduate student at 
UNC Kenan-Flagler.  Michael is a '12 UNC-Wilmington alum and 
viral marketing aficionado. In their spare time on the weekends, 
Ricky and Michael can be found at Top of the Hill doing market 
research for Blip.

With the launch of the iOS application drawing near, Blip is looking 
for a skilled, dedicated, enthusiastic team to kick-start development 
of an Android version of the app. Recommended (but not necessary) 
skills include a solid foundation in Java, exposure to XML, 
experience with Android and/or app development, and experience with
interface design.


New BS/MS Application Program 
Jodie Turnbull, Computer Science (consultant: Diane Pozefsky)

As the BS/MS program has grown, the quick and dirty application that 
we developed to manage it is no longer adequate.  We need an application 
that supports structured recommendations from faculty and pulls information 
from ConnectCarolina.  It needs to put the information into a database 
(currently it's a flat file) and produce easy-to-read reports -- both 
of individual students and program status.  The database will become 
the basis for tracking the admitted students' progress and our program 
assessments.  The application also needs to support a simple workflow 
to assure that people (students and managers) are notified when 
information is needed.  


Open-source implementation of ShortStraw sketch segmenter on iOS/Android
David Thompson, Kitware (Carrboro)

We are developing some simulation pre-processing software and would like 
users to be able to sketch some 2-D geometry to be extruded or revolved 
into a solid model. The project would focus on accepting a sketch from 
a person using a tablet or phone and segmenting the raw sketch points 
into distinct curves.

A lot of research has been done on this problem (called sketch segmentation) 
and one of the best algorithms is named ShortStraw[1]. It is also relatively 
simple. We would like (eventually, though not necessarily part of this project) 
to use it to display gesture completions for straight lines and simple curves 
as a person is sketching[2]. That means the implementation must be fast and 
robust enough to run as an on-line algorithm. The tasks involved in the project 
would be to 
(1) implement input and rendering of finger drawings, 
(2) implement the segmentation algorithm, 
(3) call functions to pass each segment's points on to the solid modeler, and 
(4) add interactivity that displays corners as they are found -- allowing the user 
to modify them before the segments are passed on to the modeler.

The project should be written in C++ so we can use it easily from within 
our existing software, although a very fast JavaScript+WebGL version might 
also be acceptable.
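For a feel of the algorithm, the core of ShortStraw is only a few lines. The sketch below is in Python for brevity (the project itself calls for C++), and it omits the resampling-to-even-spacing and line/curve post-processing passes of the full algorithm:

```python
import math
from statistics import median

def straws(points, w=3):
    """ShortStraw 'straw' at each interior point: the chord length
    between the points w steps before and after it.  A short straw
    means the path is turning sharply, i.e. a likely corner."""
    out = {}
    for i in range(w, len(points) - w):
        (x0, y0), (x1, y1) = points[i - w], points[i + w]
        out[i] = math.hypot(x1 - x0, y1 - y0)
    return out

def corners(points, w=3, ratio=0.95):
    """Indices whose straw is a local minimum below ratio * median.
    (The full algorithm also resamples the stroke first; omitted.)"""
    s = straws(points, w)
    thresh = ratio * median(s.values())
    inf = float("inf")
    return [i for i in s
            if s[i] < thresh
            and s[i] <= min(s.get(i - 1, inf), s.get(i + 1, inf))]
```

On an L-shaped stroke this flags the elbow point, which is exactly the behavior the interactive corner display in task (4) would build on.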



Self-guided walking tour of Sitterson/Brooks
Melissa Wood  (Comp Sci)          

Have you ever visited another department and not had anyone to show 
you around or been in a building looking at a display and know that 
there is much more to the story than the 3x5 card associated with it 
can present? Have you ever wandered the halls of Sitterson or Brooks 
and wanted to know more about a particular artifact, research group, 
lab, or professor?

Our department has a rich 50 year history and many stories to tell. 
We would like to develop a walking tour of the department that would 
allow a visitor to either swipe their phone on a tag or scan a code 
and be taken to either an audio file or webpage that gives them more 
of the story.  For example, scanning the Fred Brooks bust in FB 141 
could give the visitor the opportunity to hear a quote from Dr. Brooks 
in his own voice; scanning a code in the robotics lab would take 
visitors to a web page discussing the group's current projects; etc.  
Additionally we would want to provide a map that would indicate where 
the artifacts are located in the building.

Similar projects:


Newspaper Archives Analysis and Retrieval for UNC Digital Innovations Lab
Bil Hays, Computer Science

I've been helping the Digital Innovations Lab with some tech issues, 
and they have a newspaper project: PDFs of newspapers, XML files from 
OCR with header data about the publication and positional data for 
all of the words, and text files extracted from the XML. Lots of 
variety, lots of bad OCR. The directory tree for these carries a serial 
number from the Library of Congress microfilm reel.

One possibility would be a database with a web front end, with metadata
about the files in the DB, so researchers could download copies of files
based on their searches. Another would be to try to parse the XML to
identify the kinds of text (ads, poems, headlines, and article text have
different layouts, so the software might be able to make a guess as to
the type of text a file has). A team might also work on tools to help
clean up the OCR XML or text files, using grammar checkers or patterns
in the text (there are lots of hyphenated words, for example), or
perhaps just make a decent interface that shows the PDF and the text
in a form with some basic editing tools.

And there's the question of whether there are things that could be 
adapted from other kinds of stuff out there. One researcher has adapted
a bird song analyzer to work with audio from interviews in the field, 
and it has been suggested that a spam filter might be adaptable to 
sort article text from non-article text. So there are a number of 
different potential project ideas, any of which would help the
DIL with a team on this project.
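The spam-filter suggestion in the last paragraph amounts to a naive Bayes text classifier. A minimal sketch of that idea (the labels and training text here are made up for illustration):

```python
import math
from collections import Counter

def train(docs):
    """docs: iterable of (label, text) pairs.  Returns per-label word
    counts, label counts, and the vocabulary, for multinomial naive
    Bayes with add-one smoothing."""
    words, labels, vocab = {}, Counter(), set()
    for label, text in docs:
        labels[label] += 1
        counts = words.setdefault(label, Counter())
        for tok in text.lower().split():
            counts[tok] += 1
            vocab.add(tok)
    return words, labels, vocab

def classify(text, words, labels, vocab):
    """Return the label maximizing log P(label) + sum log P(token | label)."""
    total = sum(labels.values())
    best, best_score = None, -math.inf
    for label, n in labels.items():
        score = math.log(n / total)
        denom = sum(words[label].values()) + len(vocab)
        for tok in text.lower().split():
            score += math.log((words[label][tok] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best
```

Trained on hand-labeled samples of article text versus ads, the same loop would give the DIL a first-pass sorter, with the bad OCR itself supplying useful features.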



Musical Empowerment Scheduler 
Christina Cheng (Meredith Richard presenting)

Musical Empowerment is a 501(c)(3) non-profit organization that 
currently pairs 120 underserved students in the Chapel Hill
Carrboro community with college student volunteers who teach them 
free 40-minute weekly lessons.  The number of students in our 
program has doubled each year for the past four years. As the number 
of students continues to increase in our organization, it becomes 
more difficult and time-consuming to pair students and teachers 
based on the factors we consider when making pairs. Currently, 
pairings are made manually based on information about applicants 
listed in Excel spreadsheets.  We would like to automate the pairing 
process in order to make student-teacher pairings more efficiently 
and with less human error.  We consider the following factors in 
the pairing process for both students and volunteers:

*Instrument preferences: Teachers indicate the instruments they 
can teach and students indicate the instrument they would like to 
learn.  Instruments offered include piano, violin, viola, cello, 
guitar, flute, clarinet, voice, ukulele, trumpet, and saxophone.

*Time availability: Monday through Friday between 4:20-7:40 
(40-minute lesson intervals). Students and teachers that are paired 
together must have overlapping time availability.

*Language preferences: Many families do not speak English 
fluently, and parents prefer a teacher who can speak the family's 
preferred language.

*Siblings: When families enroll more than one child in the program, 
we try to pair all of their children so one sibling is not left out. 
We also try to set all lessons for the siblings simultaneously or 
back-to-back for the convenience of the parents.

*Level: Teachers indicate which level (beginning, intermediate, or 
advanced) they prefer teaching. Students will also specify 
their level.  Note that students and teachers can also indicate no 
preference.
*Gender: Teachers indicate which gender they prefer teaching and 
students specify which gender teachers they prefer. Both teachers 
and students can indicate no preference.
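As a starting point, the factors above can be encoded as a compatibility predicate plus a pairing heuristic. The field names below are invented for illustration, and a real tool would likely use proper bipartite matching (e.g. Hopcroft-Karp) or constraint solving rather than this greedy sketch:

```python
def compatible(student, teacher):
    """True when instrument, language, and at least one 40-minute
    time slot all line up (level/gender/sibling checks omitted)."""
    return (student["instrument"] in teacher["instruments"]
            and student["language"] in teacher["languages"]
            and bool(set(student["slots"]) & set(teacher["slots"])))

def greedy_pairing(students, teachers):
    """Assign each student a compatible, still-free teacher,
    preferring teachers who teach fewer instruments (a simple
    scarcity heuristic so versatile teachers stay available)."""
    pairs, taken = [], set()
    for s in students:
        options = [t for t in teachers
                   if t["name"] not in taken and compatible(s, t)]
        if options:
            t = min(options, key=lambda t: len(t["instruments"]))
            taken.add(t["name"])
            pairs.append((s["name"], t["name"]))
    return pairs
```

Sibling grouping and level/gender preferences would add more clauses to the predicate and a tie-breaking score, but the overall shape of the tool stays the same.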


Additions to CS Major Web Portal 
Rate an Internship 
Diane Pozefsky, Computer Science

A previous COMP 523 team produced a CS Majors Web Portal
where information of interest to COMP majors could be
collected, viewed, and commented on.
Some of the information and services provided are
job posting clearinghouse, interview information,
job offer information, tutoring schedules and opportunities, 
study group formation, calendar of events, etc.

This project is to extend and improve the CS Majors Portal.
We've all seen "Rate a Prof" sites -- multiple versions of them.  
The idea of a "Rate an Internship" is to build a site where students 
can give quick reviews of internships.  The idea is that students who 
have had an internship at a particular company are the best people to 
tell you what the internship was like.  The goal is to build a site that 
is quick and easy to use for students and that can be administered to 
limit it to various groups.  

The first of these should be obvious:  it shouldn't take long to fill in 
the information or to find the feedback.  The challenge is to design 
the right characteristics (no chili peppers, please!) -- and to make 
it easy to change them when we get it wrong!  

The second part is that we will want to limit the people who share the 
information but that sharing will change as time goes on.  For example, 
the first roll-out will be UNC CS students.  We then might expand it 
to CS students at a few other schools or to different disciplines at UNC.  
This will require a clean design but should not be functionally difficult.  
It may also limit the choice of development platforms.


Recreate the game Sonic Zoom in Google Chrome using web standards
Gary Bishop, Computer Science

Ten years ago a crack team of UNC undergrads created Sonic Zoom 
for Windows.  It had 3D graphics, great sound, and was accessible 
to people who are visually impaired. Bit rot has set in and it is no 
longer playable.

But now in 2015 web technology has advanced to the point that 
3D graphics and sound are possible in the browser using JavaScript. 

The goal of this project is to create a playable game with the look and 
feel of the original but running in Google Chrome.  We have the source 
of the original including the great recorded sound and textures.

Screen image

Sonic Zoom sound



Tool for the Visualization and Analysis of Curvilinear Data
Martin Styner, Computer Science

Project description: For the analysis of data measured along curvilinear 
paths (stemming from diffusion image analysis or shape analysis, 
though there are plenty more applications of this), we have developed 
a MATLAB toolbox together with UNC biostatisticians. That MATLAB toolbox 
is currently usable by biostatisticians and knowledgeable computer 
scientists, but not the average researcher. Thus the goal of this project 
is to create a cross-platform, intuitive graphical user interface that 
allows the user to do the following:

- I/O for the measurements data and subject data 
- visualization of the subject data as a table
- visualization of the measurements as 2D plots
- parameter setting for the data analysis (including I/O of parameters 
  via XML)
- computation of the data analysis by cross-platform system calls to 
  the MATLAB toolbox
- visualization of the data analysis as 2D plots
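The cross-platform system call in the list above could look like the sketch below. It assumes a hypothetical toolbox entry function `run_analysis` taking the XML parameter file; the MATLAB flags shown (`-nodisplay`, `-nosplash`, `-r`) are the standard ones for non-interactive runs on Linux/macOS:

```python
import subprocess
from pathlib import Path

def matlab_command(toolbox_dir, params_xml, matlab="matlab"):
    """Build a non-interactive MATLAB invocation: add the toolbox to
    the path, call a (hypothetical) entry point run_analysis with the
    XML parameter file, then exit.  Forward slashes keep the addpath
    string valid on Windows as well."""
    script = "addpath('{}'); run_analysis('{}'); exit".format(
        Path(toolbox_dir).as_posix(), Path(params_xml).as_posix())
    return [matlab, "-nodisplay", "-nosplash", "-r", script]

def run_analysis(toolbox_dir, params_xml):
    """Fire the call; raises CalledProcessError if MATLAB fails."""
    subprocess.run(matlab_command(toolbox_dir, params_xml), check=True)
```

The GUI would write the parameter XML, launch this call in a background thread, and read the toolbox's output files back in for the 2D plots.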

PowerPoint presentation


Making anatomic object geometric correspondence work
Steve Pizer, Computer Science

This project is in the general area of medical image analysis.
A successful project will be of assistance to medical doctors and 
researchers at UNC and elsewhere.

Document with details.

PowerPoint presentation from class


Patient Education and Injury Prevention through Serious Games
Alberto Bonifacio, UNC Health

Training with Serious Games

Supporting information:

UNC Trauma program


MidAtlantic REC Active Portal
Alberto Bonifacio, UNC Health

RAC Active Web Portal

Supporting information:

UNC Trauma program

Current Trauma Regional Advisory Committee (RAC) page

Current passive May Day information page


Bricks 2.1
David Stotts, Computer Science

We have developed a programming environment called Bricks for 
teaching COMP 110 (Intro Programming) in JavaScript.  
Bricks offers code editing, execution, and instant grading
of students' programs.  
The traditional approach to teaching programming is to have
students watch class lectures, read books, and then write 6 or 7
larger programs.
The operating principle with Bricks is that students will learn 
programming better if they write dozens of smaller programs in class,
with the guidance of the instructor, and instant grading feedback 
on them.

All submissions are saved (correct and incorrect) and can be retrieved 
by a student for each problem.  Bricks has been successfully used to 
support a class of 130 students.  

This project will be to extend Bricks with new capabilities:

-- Bricks currently grades functional correctness well (did the
   code generate the correct output for certain inputs), but needs
   more sophisticated and detailed style grading

-- Bricks supports JavaScript programming currently; we want to 
   extend it to support Java programming.

-- We need a better collection of functions, analyses, and services
   that the instructor can use to examine the progress of students,
   both individually as well as collectively

-- We need the capability to project anonymously the work of an
   individual student for the class to examine and critique.

-- We need other "social" structures to support class interaction in
   the programming process

Bricks webpage


Math Stacks          
Diane Brauner (Visually Impaired & Mobility Instructor).

The Math Stacks team contacted me today about continuing their project.  
They did an amazing job last semester and I'm excited that they are 
interested in continuing this project in the Comp 523 class.  They 
have asked to meet with me to discuss the focus for this semester.

FYI:  Here is the initial project description that I wrote last semester:

  Vertical Addition (Math Problems) using accessible Tables

Learning how to correctly line up math problems is spatially challenging 
for braille students.  With the traditional Perkins Braille Writer 
(which looks like a clunky old manual typewriter), lining up the math 
problem in columns, leaving space for carrying/borrowing, then 
inserting numbers in the correct place is tough! 

Math apps (especially for young students learning addition/subtraction
/multiplication/division) are often arranged in a grid or table format 
(rows and columns).  However, these rows/columns are visual lines and 
typically do not provide any hints to students who are visually impaired.  
These students often do not drag their finger in a straight line.  
The student may think that he is dragging straight down to hear the 
numbers in the ones column, but he is actually dragging diagonally to 
the left picking up the number in the tens column by mistake.

When using VoiceOver, iOS devices have a powerful tool called the 
Rotor.  One of the options in the Rotor enables blind users to easily 
navigate tables when the table is correctly formatted in html.  When 
you set the Rotor to Rows, a down flick (or down arrow on the keyboard 
or refreshable braille display) will move down the column and will 
announce the row header and the focused item in that column as you 
move down to the next row.  A right flick or right arrow will move 
to the next column and VoiceOver will read the next column header 
and the focused item in that row.  Tables have to be coded correctly 
to work with VoiceOver.

For an example of how to use the Rotor set to Rows, go to a table 
located at this link. 

I would like to see an app with beginning vertical math problems 
that is a true table that enables a student to navigate by rows 
or columns. 



A vertical addition problem such as 23 + 14 would be in a table similar 
to the table below (with the addition sign and equals sign included):

   2 3
   1 4
  __ __

If this vertical addition problem were coded correctly as a table, 
then students could flick or use the arrows to navigate down the ones 
column and VoiceOver would say, "three, ones column". Flick down. 
"Plus four".  Flick down.  "Equals blank".  When you navigate to 
the tens column, VoiceOver would say "two, tens column".  Flick down.  
"Plus one".  Flick down.  "Equals blank."

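For illustration, a correctly coded table for the 23 + 14 example might be generated like this. This is a sketch only: the addition sign and equals row are omitted for brevity, and the real app would render native iOS accessible elements rather than raw HTML.

```python
def addition_table(top, bottom):
    """HTML table for a two-digit vertical addition problem.  The
    <th scope='col'> headers let a screen reader (e.g. VoiceOver with
    the Rotor set to Rows) announce 'tens column' / 'ones column' as
    focus moves down a column."""
    digits = lambda n: (n // 10, n % 10)
    header = ("<tr><th scope='col'>tens column</th>"
              "<th scope='col'>ones column</th></tr>")
    body = "".join(
        "<tr><td>{}</td><td>{}</td></tr>".format(tens, ones)
        for tens, ones in (digits(top), digits(bottom), ("blank", "blank")))
    return "<table>{}{}</table>".format(header, body)
```

The key accessibility point is the `scope` attribute on the header cells: without it, VoiceOver cannot associate a data cell with its column as the student flicks down.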
The Vertical Math App should include:

* Be set up as an accessible table labeled with ones, tens, 
  hundreds columns.  (Table works correctly with the Rotor set to 
  Rows and then flicks/arrows to navigate.)

* VoiceOver should correctly state the addition sign, equals and "blank".
* One math problem per page.
* Button to check answer. 
* All buttons/touch targets should be standard size (not too small), 
  high contrast (for low vision students) and placed in standard 
  locations.  (I will work with you on how to make the app accessible!)
* Math problems should appear in a random order.

Optional Features:

* Add a level that has the ability to borrow/carry.  (Add boxes at 
  the top of the vertical problem and ability to "cross out" to borrow.)
* Add a level for subtraction problems.
* Add levels for multiplication/division.
* Way to keep track of right/wrong answers.
* Add ability for teachers to create their own math problems (possibly 
  selecting the numbers that will be put into your table?)


Games for Visually Impaired People
Diane Brauner (Visually Impaired & Mobility Instructor).

See this list.

One more idea to add comes from Ed Summers, who leads the 
accessibility team at SAS.  Ed is visually impaired and would 
personally like to see this happen:

Ed would like bus stops to be organized by routes, with GPS 
information coded in a way that can be accessed in BlindSquare.  
BlindSquare is a wonderful accessible navigation app designed for 
people who are Blind and Visually Impaired (VIB).  Currently, VIB 
users can use GPS apps to know that there is a bus stop along a 
specific road; however, the actual physical bus stop location is not 
provided.  (E.g., where exactly is the pole that indicates the bus 
stop?)  We would like to have a list of the bus stop coordinates 
that can be opened in the BlindSquare app.  Ideally, the bus arrival 
times would also be included similar to the "Go Live" app that tells 
the user exactly when the next bus will arrive.  This could be web 
based - static web content.  Ed would specifically like bus routes 
for UNC, Raleigh, NC State, and Cary (C Route?); there are many 
people who would take advantage of this information.

FYI:  There is a Carolina Team working on an App called U-Tag; 
this app is a way to specifically label college buildings, landmarks, 
and other Orientation and Mobility type information for VIB users on 
college campuses.  U-Tag is an app that enables a group of people to 
initially mark all of the Points of Interest and then this data is 
shared with anyone who is using BlindSquare.  U-Tag is different from 
the bus information, even though U-Tag can mark specific bus stops.  
The bus project might instead access the transit authority’s bus stop 
coordinates or another source.  (It does not necessarily need someone 
to physically go and tag the bus stops as Points of Interest.)

I can put you in contact directly with the BlindSquare developers.  
They have worked closely with the U-Tag group and are very open 
to expanding how BlindSquare can be used.