This summer Joe Kale, under the supervision of Prof. Michael Thompson, developed Nerv, a software framework designed to organically deliver information to users. The work was done in the Department of Electrical and Computer Engineering, with funding provided by the PPL Undergraduate Research Fund. Nerv's primary component is an application on Google Glass, a wearable, head-mounted computer. The application leverages the heads-up display (HUD) of Glass to overlay supplementary information for a user, based on where they are or what they are looking at.
There are two primary situations for which Nerv was designed. The first is a guided tour. An organization, like Bucknell, would pre-populate a database with data for a tour in the form of text, images, audio, and/or video. Users would then walk to different places along the tour and prompt Glass to give them more information. For example, a user might walk to the campus library and prompt Glass for more information on it. Nerv would query the database for content pertaining to the user's current location and load it into Glass. Users could then read, watch, or listen to information about the library that would not otherwise be immediately available to them.
The other primary situation is as a supplementary training tool. For example, a student may be in a laboratory environment and forget how to use a particular piece of equipment. Again, an organization would have populated a database with training materials, tutorials, and references of interest to someone using a particular machine. The user would prompt Glass to scan a Quick Response (QR) code on the machine, which would direct Nerv to download relevant materials from the database, giving the user the information they need to use that equipment. Glass lends itself particularly well to this scenario because the information does not obstruct the student's work; Glass's display sits outside the user's primary line of sight until needed.
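The QR-driven lookup described above can be sketched as follows. This is an illustrative sketch only: the decoded QR payload is treated as an equipment identifier and mapped to training materials. The database contents, equipment IDs, and function names here are hypothetical, not Nerv's actual schema.

```python
# Hypothetical database mapping equipment IDs (as encoded in QR codes)
# to lists of training materials. Nerv's real storage layer may differ.
MATERIALS_DB = {
    "oscilloscope-01": ["intro_video.mp4", "probe_setup.pdf"],
    "mill-03": ["safety_checklist.pdf", "cnc_tutorial.mp4"],
}

def materials_for_qr(qr_payload: str) -> list:
    """Return the training materials associated with a decoded QR payload.

    The Glass camera and a QR library would supply qr_payload; here we
    assume it is already decoded to a plain equipment-ID string.
    """
    equipment_id = qr_payload.strip()
    return MATERIALS_DB.get(equipment_id, [])
```

In use, scanning the QR code on the milling machine would yield its ID, and `materials_for_qr("mill-03")` would return the safety checklist and tutorial for download to Glass.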
Nerv began last year as an independent study project by two graduates, Daniel Prudente and Alex Meijer. During the summer, Joe picked up the project by adding relatively high-accuracy location awareness to Nerv, allowing users to access content simply by where they are standing, not just by looking at a QR code. To do this, he implemented a modified version of the RADAR location tracking system developed by P. Bahl and V. Padmanabhan at Microsoft Research. This is one of the first implementations of this widely used system on the Glass platform.
To implement this system, Joe had to take into consideration many of Glass's severe limitations, such as extremely limited battery life, low processing power, and a small array of sensors. Additionally, Nerv is meant to be a real-world system, while RADAR was largely developed and tested in a controlled environment. Since RADAR is based on WiFi signal strengths, which are inherently unstable, overcoming the unpredictability of a real-world setting was extremely difficult. Joe began the summer by testing the viability of leveraging the existing Bucknell WiFi infrastructure for a RADAR implementation. The signals in Breakiron and Dana Engineering were found to be mostly stable, albeit less than ideal. Joe combined techniques such as software filtering, signal analytics, and dynamic signal-variance thresholds to tailor the RADAR system to Nerv's needs. The result is that Nerv can locate users with room-level accuracy.
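The core of the RADAR approach described above can be sketched briefly. In Bahl and Padmanabhan's scheme, each location is fingerprinted offline with the typical WiFi signal strength (in dBm) seen from each access point; at run time, live readings are matched to the nearest fingerprint in signal space. The sketch below adds a simple median filter to damp the signal instability noted above. The fingerprint values, access-point names, and room labels are hypothetical, and this is not Joe's actual implementation.

```python
import statistics

# Hypothetical offline fingerprints: mean RSSI (dBm) per access point,
# collected in advance for each room.
FINGERPRINTS = {
    "Breakiron 164": {"ap1": -45.0, "ap2": -70.0, "ap3": -60.0},
    "Dana 318":      {"ap1": -72.0, "ap2": -48.0, "ap3": -55.0},
}

def smooth(readings):
    """Median-filter a window of raw RSSI samples per access point.

    `readings` maps each AP name to a list of recent dBm samples; the
    median damps the spikes typical of real-world WiFi measurements.
    """
    return {ap: statistics.median(samples) for ap, samples in readings.items()}

def locate(readings):
    """Return the room whose fingerprint is nearest in signal space."""
    live = smooth(readings)

    def distance(fingerprint):
        # Squared Euclidean distance in signal space; APs not heard
        # live are treated as a weak -100 dBm reading.
        return sum((fingerprint[ap] - live.get(ap, -100.0)) ** 2
                   for ap in fingerprint)

    return min(FINGERPRINTS, key=lambda room: distance(FINGERPRINTS[room]))
```

For example, a window of readings averaging roughly -45, -70, and -60 dBm from the three access points would match the Breakiron 164 fingerprint. A production system would also need the dynamic variance thresholds mentioned above to decide when readings are too unstable to trust.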
Joe also worked on various other parts of Nerv, including basic security measures, GPS functionality, and modularization of the code. The project is open source and hosted on GitHub.