Clash of the Coders: 2017 Winners
After a grueling 72-hour competition, our Clash of the Coders 2017 projects are complete and the results are in. With only a few points separating each of our teams, we present our three winners.
Third Place: The Winning Team
Chris Morris, Jordan Killpack, Sarah Shapiro and Joseph Dixon worked on an Apple TV project called Lasso BNR Dashboard to help employees stay up-to-date on Big Nerd Ranch news. This highly dynamic and remotely configurable dashboard platform offers live updates from any data source, including (but not limited to) Slack, GitHub, Twitter and Instagram.
Second Place: dropAndGimme(20)
Presenting NerdFit, an Android TV application, were Kathryn Thomas, Matt Raufman, Rashad Cureton and Step Christopher. In an effort to encourage Big Nerd Ranch employees to work out in the company gym, dropAndGimme(20) designed an app that provides a personalized, motivating experience to gym-goers, controlled by either the remote or voice. In addition to keeping track of personal stats, the app showcases how-to videos for everything from bench squats to arm curls and allows users to see details of their next workout.
And the Winners… The Polyglots!
The project came about from instructor feedback regarding the Big Nerd Ranch bootcamp. In a modern-day bootcamp, there is not much time for teachers to get fully acquainted with students, and The Polyglots looked to fix this. Using a browser-based AR viewer and markers for each student, Course Correction allows teachers to quickly associate a student’s face with their name, current book progress and responses to a pre-course survey. By tracking hand raises and student book progress (captured via each student’s PDF viewer), teachers can see which sections prompted the most questions and can record notes for course owners, allowing them to evaluate the material.
The camera in the front of the classroom uses OpenCV to detect hand raises. A Haar feature-based cascade classifier detects faces and hands and, together with ArUco marker detection, provides the raw data. Faces and AR markers are paired to identify students. Since hands are typically attached to the same bodies as the faces, the camera app searches for hands in the vicinity of faces. When a hand raise is detected, the event is sent to Firebase.
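The "hands in the vicinity of faces" step can be sketched independently of the actual OpenCV detectors. Here, detections are assumed to arrive as (x, y, w, h) bounding boxes (the shape returned by OpenCV's CascadeClassifier.detectMultiScale), and each hand is attributed to the nearest face within a pixel radius. The function names and the distance threshold are illustrative assumptions, not the team's actual code.

```python
# Sketch: attribute each detected hand to the nearest nearby face.
# Boxes are (x, y, w, h) tuples; max_dist is an illustrative threshold.

def center(box):
    x, y, w, h = box
    return (x + w / 2, y + h / 2)

def pair_hands_to_faces(faces, hands, max_dist=150):
    """Pair each hand with the closest face within max_dist pixels."""
    pairs = []
    for hand in hands:
        hx, hy = center(hand)
        best_face, best_d = None, max_dist
        for face in faces:
            fx, fy = center(face)
            d = ((hx - fx) ** 2 + (hy - fy) ** 2) ** 0.5
            if d < best_d:
                best_face, best_d = face, d
        if best_face is not None:
            pairs.append((best_face, hand))
    return pairs
```

A raised hand with no face in range would simply be discarded rather than reported as an event.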
And finally, to compute statistics, the Node.js server watches for new events in Firebase (such as a student advancing to the next page in the curriculum or raising their hand) and streams aggregate stats back to Firebase for consumption by the other apps. When the instructor is ready to notate rough spots in the curriculum, they use the React-powered instructor dashboard to see which sections of the exercises took the longest and caused the most hand raises.
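The aggregation the server performs can be sketched as a pure function over a stream of events. The event shape used here (dicts with "type" and "section" keys) is an assumption for illustration, not the team's actual Firebase schema, and the sketch is in Python rather than the Node.js the team used.

```python
# Sketch: roll a stream of classroom events up into per-section stats,
# the kind of aggregate the dashboard would read back from Firebase.
from collections import defaultdict

def aggregate(events):
    """Count hand raises and page advances per curriculum section."""
    stats = defaultdict(lambda: {"hand_raises": 0, "page_advances": 0})
    for event in events:
        section = event["section"]
        if event["type"] == "hand_raise":
            stats[section]["hand_raises"] += 1
        elif event["type"] == "page_advance":
            stats[section]["page_advances"] += 1
    return dict(stats)
```

Sections with many hand raises relative to page advances are the "rough spots" an instructor would flag on the dashboard.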
Pretty awesome, huh? We think so too. Congratulations to all of our teams on surviving the competition. We’re so proud to call these Nerds our own. Until next year!