Friday Hacks #27, Aug 31

Posted by ejames

This week we have Angad Singh, here to talk about the research he did at Stanford while on the NUS Overseas Colleges (NOC) programme, and Michael Caronna, a professional photographer who built an automated motion-tracking camera stand.

Talk 1: Mininet: a network in your laptop (Angad Singh, NUS Student)

Talk Description: Multiple network paths are available today on a single computer or mobile device, such as WiFi, 3G and LTE. The idea is to use all of them simultaneously to provide bandwidth aggregation (a fat pipe).

I will also be demonstrating this with the help of Mininet, an open-source network emulator currently being developed at Stanford.
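
If you want to get a feel for Mininet before Friday, here is a minimal sketch (our illustration, not Angad's demo code) of what a topology script looks like in Mininet's Python API: two hosts joined by two independent switched paths, the sort of multi-path setup you would want before experimenting with bandwidth aggregation. The bandwidth numbers are made up, and you need a Linux machine with Mininet installed and root access to run it.

```python
#!/usr/bin/env python
# Minimal Mininet sketch: two hosts joined by two independent switched
# paths. Run with: sudo python twopath.py
from mininet.net import Mininet
from mininet.topo import Topo
from mininet.link import TCLink
from mininet.cli import CLI
from mininet.log import setLogLevel

class TwoPathTopo(Topo):
    "Two hosts connected through two switches, one switch per 'path'."
    def build(self):
        h1 = self.addHost('h1')
        h2 = self.addHost('h2')
        s1 = self.addSwitch('s1')   # stands in for the WiFi path
        s2 = self.addSwitch('s2')   # stands in for the 3G/LTE path
        # Illustrative per-link bandwidth caps (Mbit/s) so the paths differ.
        self.addLink(h1, s1, bw=10)
        self.addLink(s1, h2, bw=10)
        self.addLink(h1, s2, bw=5)
        self.addLink(s2, h2, bw=5)

if __name__ == '__main__':
    setLogLevel('info')
    net = Mininet(topo=TwoPathTopo(), link=TCLink)
    net.start()
    CLI(net)   # try 'pingall' or 'iperf h1 h2' at the mininet> prompt
    net.stop()
```

The script only builds the emulated network; how to actually split traffic across both paths is what the talk is about.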

Speaker profile: I am a Year-4 Computer Engineering undergraduate and I just got back from NUS Overseas Colleges in Silicon Valley. I am a former president of NUS Hackers.

I took classes at Stanford and interned at Skype. Apart from that, I also worked for peanuts in a Stanford networking lab for two months, where I did this research after taking a course on Advanced Computer Networks.

Talk 2: From zero to motion tracking camera stand (Michael Caronna, Professional Photographer)

Talk Description: As a photographer covering sports and other subjects, I have often used remotely triggered still cameras to take pictures from angles a person could never shoot from, or to shoot from multiple angles simultaneously. About a year ago I began to wish for a cheap “smart” remote camera that would follow the action as it moved around the field. Not having any background in programming or electronics, I began looking around the web and found a bunch of paintball fanatics who had made autonomous turrets that would shoot opponents. Using that as my inspiration and proof that it was possible, I kept googling and eventually decided to build my project on a PC running Ubuntu using OpenCV, Python, the PlayStation Eye camera and an Arduino. Over the course of a year I studied Python and the Arduino (attending Arduino classes at Hackerspace Singapore) and, stealing heavily from online sources, cobbled together a pan-tilt head that will track specific colors and point an SLR at the target.

Prep: No prerequisites, but I’m happy to distribute the Python source for the hue tracking and servo movements (such as it is – it’s amateurish code) to anyone who wants it. Dependencies are Python 2.7 and OpenCV. To make it work as-is you need a webcam, plus either an Arduino or to comment out the servo-movement portion of the code. Being hackers, I suppose everyone will be wearing black T-shirts, but if you want to have a little fun and be tracked by the camera, wear a bright, solid-color shirt.
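
For a flavour of how this kind of hue tracking usually fits together, here is a rough sketch (not Mike's code; he will share his own source at the talk): threshold an HSV colour range, find the biggest blob, and send its horizontal position to an Arduino as a pan angle. The serial port name, HSV bounds and one-byte protocol are all assumptions you would tune, pyserial is an extra dependency, and it is written against Python 2.7 with OpenCV 2.x to match the dependencies above.

```python
# Rough hue-tracking sketch (Python 2.7, OpenCV 2.x, pyserial).
import cv2
import numpy as np
import serial

PORT = '/dev/ttyUSB0'               # assumed Arduino port; adjust for your setup
LOWER = np.array([100, 120, 70])    # example HSV range, roughly "bright blue"
UPPER = np.array([130, 255, 255])

arduino = serial.Serial(PORT, 9600, timeout=1)
cap = cv2.VideoCapture(0)           # the PlayStation Eye shows up as a normal webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(largest)
        cx = x + w // 2
        # Map the blob's horizontal position to a 0-180 degree pan angle and
        # send it as one byte for an Arduino sketch to feed to Servo.write().
        angle = int(180.0 * cx / frame.shape[1])
        arduino.write(chr(angle))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow('tracking', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
arduino.close()
```

On the Arduino side, a few lines with the Servo library reading from Serial would close the loop; wear that bright shirt and the pan head should follow you around the room.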

Speaker Profile:
Mike is a news photographer and photo editor from the U.S. based in Singapore. He has previously worked in Hong Kong, Australia and Japan shooting all kinds of news including sports, politics, business, and entertainment. Mike discovered Python and the Arduino a year ago and has since been making a nuisance of himself at Hackerspace Singapore.

Remember to sign up at: //bit.ly/friday-hacks

Location: COM1 SR3 [COM1/212]
Time: 6pm - 9pm. Free pizza and mingling @ 6pm; talks start at 7pm. You are welcome to stay and mingle (or hack!) after the talks.

Please sign up at //bit.ly/friday-hacks

For a map, more details, and guidelines on giving a talk at Friday Hacks, see //nushackers.org/fridayhacks/
For more info on NUS Hackers, see //nushackers.org/about
For more Friday Hacks talks, see //nushackers.org/
