Blogs from the Ranch


Clash of the Coders: 2017 Winners


Jade Hill

After a grueling 72-hour competition, our Clash of the Coders 2017 projects are complete and the results are in. With only a few points separating each of our teams, we present our three winners.

Third Place: The Winning Team

Chris Morris, Jordan Killpack, Sarah Shapiro and Joseph Dixon worked on an Apple TV project called Lasso BNR Dashboard to help employees stay up-to-date on Big Nerd Ranch news. This highly dynamic and remotely configurable dashboard platform offers live updates from any data source, including (but not limited to) Slack, GitHub, Twitter and Instagram.

Lasso Dashboard

Second Place: dropAndGimme(20)

Presenting NerdFit, an Android TV application, were Kathryn Thomas, Matt Raufman, Rashad Cureton and Step Christopher. In an effort to encourage Big Nerd Ranch employees to work out in the company gym, dropAndGimme(20) designed an app that provides a personalized, motivating experience for gym-goers, controlled with either the TV remote or voice. In addition to keeping track of personal stats, the app showcases how-to videos for everything from bench squats to arm curls and allows users to see details of their next workout.

NerdFit Voice Interaction

And the Winners… The Polyglots!

There can only be one team that receives total Nerd glory, and this year that team consisted of Zack Simon, Bolot Kerimbaev and Jonathan Martin with their Course Correction project.

Course Correct Logo

The project came about from instructor feedback regarding the Big Nerd Ranch bootcamp. In a modern-day bootcamp, there is not much time for teachers to get fully acquainted with students, and The Polyglots looked to fix this. Using a browser-based AR viewer and a marker for each student, Course Correction allows teachers to quickly associate a student’s face with their name, current book progress and responses to a pre-course survey. By tracking hand raises and student book progress (captured via each student’s PDF viewer), teachers can see which sections prompted the most questions and can record notes for course owners, allowing them to evaluate the material.

In order to create the app, The Polyglots used a number of technologies. The instructor’s augmented reality app is powered by React.js and uses A-Frame for declarative virtual reality, ARToolKit (a C library compiled to JavaScript with Emscripten) to find markers and calculate 3D pose, and Three.js for 3D drawing.

Teacher Mobile App for Course Correct
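For a sense of how those pieces fit together: ARToolKit reports each detected marker’s pose as a 4x4 transformation matrix, which the viewer then hands to Three.js for drawing. The helper below is a hypothetical sketch (not code from the Course Correction project) showing where the marker’s position lives in a column-major pose matrix, the convention Three.js also uses.

```javascript
// Sketch: reading a marker's position out of an ARToolKit-style pose.
// In a column-major 4x4 matrix (Three.js convention), the translation
// occupies elements 12, 13 and 14. Hypothetical helper for illustration.
function poseToPosition(matrix) {
  if (matrix.length !== 16) throw new Error("expected a 4x4 matrix");
  return { x: matrix[12], y: matrix[13], z: matrix[14] };
}

// With Three.js you would typically apply the whole matrix instead, e.g.:
//   overlay.matrixAutoUpdate = false;
//   overlay.matrix.fromArray(poseMatrix);
```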

The student PDF viewer is an Electron app that wraps Mozilla’s PDF.js with some additional event tracking; those events are pushed to the other four apps in real time via Firebase.
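As a rough sketch of that event tracking: PDF.js exposes a `pagechanging` event on its event bus, which the Electron renderer can translate into a record for Firebase. The payload shape below (`studentId`, `page`, `at`) is an assumption for illustration, not the project’s actual schema.

```javascript
// Hypothetical event builder: turns a PDF.js page change into the kind
// of record the student viewer might push to Firebase.
function pageChangeEvent(studentId, pageNumber, now = Date.now()) {
  return { type: "pageChange", studentId: studentId, page: pageNumber, at: now };
}

// Wiring it up inside the Electron renderer might look like:
//   pdfViewer.eventBus.on("pagechanging", (evt) => {
//     firebase.database()
//       .ref(`events/${studentId}`)
//       .push(pageChangeEvent(studentId, evt.pageNumber));
//   });
```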

The camera at the front of the classroom uses OpenCV to detect hand raises. A Haar feature-based cascade classifier detects faces and hands and, together with ArUco marker detection, provides the raw data. Faces and AR markers are paired to identify students. Since hands are typically attached to the same bodies as the faces, the camera app searches for hands in the vicinity of faces. When a hand raise is detected, the event is sent to Firebase.

Polyglots Raising Hands for Detection
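The “hands in the vicinity of faces” heuristic can be sketched as below. Detection itself would come from OpenCV’s cascade classifiers; this hypothetical helper shows only the pairing step, counting a hand as a raise when its bounding box sits close to, and above, a detected face.

```javascript
// Boxes are plain objects: { x, y, w, h }, with y growing downward
// (image coordinates). Hypothetical pairing logic for illustration.
function center(box) {
  return { x: box.x + box.w / 2, y: box.y + box.h / 2 };
}

function pairHandsWithFaces(faces, hands, maxDistance) {
  const raises = [];
  for (const hand of hands) {
    const h = center(hand);
    for (const face of faces) {
      const f = center(face);
      const dist = Math.hypot(h.x - f.x, h.y - f.y);
      // A raise: the hand is near the face and above it in the frame.
      if (dist <= maxDistance && h.y < f.y) {
        raises.push({ face: face, hand: hand });
        break;
      }
    }
  }
  return raises;
}
```

Each detected raise would then be paired with the student’s AR marker and pushed to Firebase as an event.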

And finally, to compute statistics, the Node.js server watches for new events in Firebase (such as a student advancing to the next page in the curriculum or raising their hand) and streams aggregate stats back to Firebase for consumption by the other apps. When the instructor is ready to notate rough spots in the curriculum, they use the React-powered instructor dashboard to see which sections of the exercises took the longest and caused the most hand raises.
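A minimal sketch of that aggregation step: fold a stream of events into per-section counts. The event shapes and section ids here are assumptions for illustration; the real server also streams the resulting aggregates back into Firebase for the other apps.

```javascript
// Hypothetical aggregator: tallies hand raises and page changes per
// curriculum section from a list of events.
function aggregateStats(events) {
  const stats = {}; // section -> { handRaises, pageChanges }
  for (const evt of events) {
    const s = stats[evt.section] ||
      (stats[evt.section] = { handRaises: 0, pageChanges: 0 });
    if (evt.type === "handRaise") s.handRaises += 1;
    if (evt.type === "pageChange") s.pageChanges += 1;
  }
  return stats;
}

// In the live system, a Firebase listener such as
//   db.ref("events").on("child_added", (snap) => { ...update stats... })
// would feed events in one at a time instead of as a batch.
```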

Pretty awesome, huh? We think so too. Congratulations to all of our teams on surviving the competition. We’re so proud to call these Nerds our own. Until next year!

