WigWag Collaborates with USC Mobile Networks Design Master's Class

This past spring, WigWag collaborated with a University of Southern California Master's class (USC EE579 - Wireless and Mobile Networks Design and Laboratory) led by Professor Bhaskar Krishnamachari.

The class comprised 15 students, with the objective of learning about and getting hands-on experience developing with low-power wireless networks, mobile devices, and IoT devices. The project also gave the students the opportunity to architect and develop an end-to-end IoT system, including the development of Android and cloud-based applications.

As part of their final semester project, the students worked with deviceJS and hardware provided by WigWag to develop IoT applications on our platform. The class projects are listed below; you can find more information on each one using the individual project links. You can find the main projects page here.

We think the students did an outstanding job!

USC Class Projects leveraging WigWag's Technology

Project: Roomba + 2D mapping - Roomba Obstacle Mapping Project Detail Page

The Roomba is an autonomous vacuum cleaner. For this project, the students extracted spatial information from it to create a 2D map of the physical environment.

“The promise of IoT is smart everything.” In fulfilling this promise for robotics, localization is the next frontier. The challenge is to achieve a solution without requiring human presence. The students' other motivation was to explore an approach different from the indoor-localization solutions currently available.

Roomba mapping with beacons

Project: MusicPlayer, EmotionDetector, Visible Spectrum - ECMusic Project Detail Page

The aim of this project was to build a media player application on an Android phone. Beyond the basic media player functions of play, pause, continue, stop, and previous/next song, it can automatically play a song matching the user's emotion, inferred from ECG and body-temperature sensors. It displays the audio spectrum both on screen and on WigWag Filament smart LED bulbs.
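As a rough illustration of the emotion-to-music idea (not the students' actual code), the sensor readings might be mapped to a mood and a matching song like this; the thresholds, mood names, and playlists below are invented for illustration:

```javascript
// Hypothetical sketch: classify a listener's mood from ECG-derived
// heart rate (bpm) and body temperature (°C), then pick a matching song.
// Thresholds and playlists are invented for illustration only.
const playlists = {
  calm:     ["Clair de Lune", "Weightless"],
  happy:    ["Here Comes the Sun", "Walking on Sunshine"],
  stressed: ["Breathe", "Ambient 1"]
};

function classifyEmotion(heartRateBpm, bodyTempC) {
  if (heartRateBpm > 100 && bodyTempC > 37.0) return "stressed";
  if (heartRateBpm > 85) return "happy";
  return "calm";
}

function pickSong(heartRateBpm, bodyTempC) {
  const mood = classifyEmotion(heartRateBpm, bodyTempC);
  const songs = playlists[mood];
  return { mood, song: songs[0] }; // a real player might shuffle the list
}

console.log(pickSong(72, 36.6)); // → { mood: 'calm', song: 'Clair de Lune' }
```

A real system would also need to smooth noisy sensor readings over time before switching songs.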

ECMusic Demo

Project: Amazon Echo (Alexa) + Rules - IoT Device Control With Amazon Echo

WigWag supports Alexa; using our APIs, you can control a supported light bulb's brightness, color, and power. In this project, the students added custom skills that let users further automate their environment, using voice commands to create "when-then" rules.

Alexa Project Introduction

Here are some example voice commands:

  • "Alexa, ask WigWag to turn the lights to warm white at 6pm every day"
  • If you have a mood named "Red filaments" that turns all the lights red: "Alexa, ask WigWag to enable mood Red filaments at 7pm on Wednesday"

Advanced option: Set up rules through conversation

User: "Alexa, ask WigWag to setup a when-then rule"
Alexa: "Okay. WigWag is ready. Specify your when-then request"
User: "Alexa, ask WigWag when motion occurs then turn on patio light"
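A minimal sketch of how a spoken request like the one above might be parsed into a rule object (this is not WigWag's actual Alexa skill code; the grammar and rule structure are assumed for illustration):

```javascript
// Hypothetical sketch: parse a spoken "when ... then ..." request into a
// simple rule object that an automation engine could evaluate.
// The regex grammar here is invented; the real skill's parsing may differ.
function parseWhenThen(utterance) {
  const match = /when\s+(.+?)\s+then\s+(.+)/i.exec(utterance);
  if (!match) return null; // not a when-then request
  return { when: match[1].trim(), then: match[2].trim() };
}

const rule = parseWhenThen("when motion occurs then turn on patio light");
console.log(rule); // → { when: 'motion occurs', then: 'turn on patio light' }
```

In practice the Alexa Skills Kit would deliver the condition and action as intent slots rather than raw text, but the resulting rule object would look much the same.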

Filament Control Using Amazon Echo

Sense and Light using Amazon Echo

Project: Foscam Wifi camera based facial recognition - Smart Home Security Project Detail Page

In this project, the students created a website that lets users complete email registration and upload a picture to be used for identification. When a user appears in front of the camera, their presence is detected via facial recognition. Using a database of users and deviceJS, the students were able to trigger specific rules and events based on which user was identified.

A real-world example of this technology: when someone comes home and is recognized by a front-door camera, rules are executed to greet them, such as turning on the lights or unlocking the door.
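The lookup step described above can be sketched roughly as follows (the user database and action names are invented; the students' deviceJS code would drive real devices rather than return strings):

```javascript
// Hypothetical sketch: map a recognized user to their configured rules.
// In the students' system, deviceJS would execute real device actions;
// here we simply return the action names for illustration.
const userRules = {
  "alice@example.com": ["turn on entry lights", "unlock front door"],
  "bob@example.com":   ["turn on entry lights"]
};

function onFaceRecognized(userId) {
  const actions = userRules[userId];
  if (!actions) return ["unknown person: send alert"]; // unregistered face
  return actions;
}

console.log(onFaceRecognized("alice@example.com"));
// → [ 'turn on entry lights', 'unlock front door' ]
```

Treating an unrecognized face as its own event is a useful design choice here: the same pipeline then covers both convenience (greeting residents) and security (alerting on strangers).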

Facial Recognition for Smart Home Security

Project: Spotify/Pandora + Sonos + Filaments - Visual Symphony Project Detail Page

In this project, the students created visual art by representing the characteristics of music with light. Using a product such as Sonos to play music and Filament smart LEDs to change the lighting, this could provide a striking visual accompaniment to music in any smart home.

The project can use a microphone to listen to the music or get the audio directly from a service such as Spotify. It converts the audio to the frequency domain, maps the average amplitude to bulb brightness, and assigns each color a frequency band. This data is then fed into deviceJS, which controls the bulbs, so the lights change colors as the music plays.
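The mapping described above might be sketched like this, assuming a magnitude spectrum has already been computed from the audio (the band boundaries and color assignments are invented for illustration and may differ from the students' implementation):

```javascript
// Hypothetical sketch: map a magnitude spectrum to bulb brightness and
// color. Brightness follows the average amplitude; the color comes from
// whichever frequency band holds the most energy.
// Bands and colors are invented; the real project's mapping may differ.
const bandColors = ["red", "green", "blue"]; // low, mid, high bands

function spectrumToLight(magnitudes, maxAmplitude) {
  const avg = magnitudes.reduce((a, b) => a + b, 0) / magnitudes.length;
  const brightness = Math.min(100, Math.round((avg / maxAmplitude) * 100));

  // Split the spectrum into equal bands and sum the energy in each.
  const bandSize = Math.ceil(magnitudes.length / bandColors.length);
  let bestBand = 0, bestEnergy = -1;
  for (let b = 0; b < bandColors.length; b++) {
    const band = magnitudes.slice(b * bandSize, (b + 1) * bandSize);
    const energy = band.reduce((a, x) => a + x, 0);
    if (energy > bestEnergy) { bestEnergy = energy; bestBand = b; }
  }
  return { brightness, color: bandColors[bestBand] };
}

// A spectrum with most of its energy in the high band:
console.log(spectrumToLight([1, 1, 2, 2, 8, 9], 10));
// → { brightness: 38, color: 'blue' }
```

With two bulbs, as in the second demo below, each bulb could simply be assigned its own band instead of picking a single dominant one.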

Visual Symphony using 1 filament

Visual Symphony using 2 filaments

We would like to congratulate Professor Krishnamachari and his students for a job well done!
