OSREI

Skills & Introduction:



OSREI, or the Open Source Remote Education Initiative, is a project that aims to reduce educational disparity by providing high-quality, crowd-sourced educational content through tablet computers and the cloud. OSREI is already running in Vietnam, the Philippines and India!

Research Questions:

  • Can we deliver quality education through the cloud?
  • What is the improvement in learning retention due to technological support?

Detailed Description:

In certain remote regions across the world, children have poor access to good teachers, good advisers and good information. They receive little to no non-academic education such as digital literacy, visualization, career counselling, etc. Meanwhile, schools in cities often have electronically aided education while simultaneously having excellent staff to help their students. Electronically aided education is least present in the places where it is most necessary: areas where high-quality educational resources are scarce. We use modern Android-based tablets to facilitate Minimally Invasive Education (MIE) in these areas.

My Role

I founded OSREI with Tejasvi Mehra at The University of Hong Kong. I was in charge of organizing our expeditions, securing funding, building the website and writing a script that lets us upload and download educational content onto the Android tablets via the Dropbox platform. I am in charge of the Philippines and India chapters of the project. I established our partnership with the Yellow Boat of Hope Foundation and have worked with Jay Jaboneta and Anton Lim. Our work was featured in the HKU Annual Review as a key student project.
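The sync script itself isn't reproduced here, but the idea is simple: pull any new files from a shared Dropbox content folder down to the tablet's local storage, and push logs or new material back up. Here is a minimal sketch of that logic using the Dropbox Python SDK; the folder names and access token are hypothetical placeholders, not the project's actual configuration.

```python
# Minimal sketch of a Dropbox content-sync loop (hypothetical paths/token).
import os
import dropbox

ACCESS_TOKEN = "YOUR_DROPBOX_TOKEN"      # placeholder, not a real token
REMOTE_FOLDER = "/osrei-content"         # hypothetical shared content folder
LOCAL_FOLDER = "/sdcard/osrei-content"   # hypothetical tablet storage path

def sync_down(dbx):
    """Download any remote files that are missing locally."""
    os.makedirs(LOCAL_FOLDER, exist_ok=True)
    # Note: large folders would need files_list_folder_continue() pagination.
    for entry in dbx.files_list_folder(REMOTE_FOLDER).entries:
        if isinstance(entry, dropbox.files.FileMetadata):
            local_path = os.path.join(LOCAL_FOLDER, entry.name)
            if not os.path.exists(local_path):
                dbx.files_download_to_file(local_path, entry.path_lower)
                print("Downloaded", entry.name)

def sync_up(dbx, local_file):
    """Upload a single local file (e.g. usage logs) back to Dropbox."""
    with open(local_file, "rb") as f:
        dbx.files_upload(f.read(),
                         REMOTE_FOLDER + "/" + os.path.basename(local_file),
                         mode=dropbox.files.WriteMode.overwrite)

if __name__ == "__main__":
    dbx = dropbox.Dropbox(ACCESS_TOKEN)
    sync_down(dbx)
```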

Why

I want to build technology that helps people develop solutions to their own problems so that we can, in a decentralized way, make the world a better place. As a computer engineer, I believe digital literacy is the key to empowerment: the internet is the ultimate learning tool and the future of opportunity for someone who cannot afford or access a formal education. OSREI was the result of this belief.
I was inspired by Sugata Mitra's Hole in the Wall project and decided to upgrade it for this decade!







Mindorobots

Skills & Introduction:



The goal of this project is to empower coastal communities to affordably photograph, map and conserve their reefs. MindoroBot is a swarm robot that can sail and create photogrammetric reef maps autonomously, at low cost, using a laser quadrat.

Research Questions:

  • Can we generate 3D images from 2D images using Range Image Photogrammetry and a laser?
  • Can we use color correction and image-processing algorithms to find the reef cover autonomously?
  • Can we reduce the cost of an autonomous reef-imaging platform through the use of everyday technology?

Detailed Description:

Coral reef ecosystems are some of the most diverse and valuable ecosystems on earth. A major issue with coral conservation is that reef mapping is done by divers moving and photographing a PVC quadrat for every unit area of the reef. MindoroBot is a swarm robot that can sail, photograph and map reefs autonomously, at low cost, using a laser quadrat. We hacked an aerial mapping drone to be able to do this and ran our prototype at Mindoro island, the Philippines. The camera used was an off-the-shelf GoPro mounted on a standard smartphone gimbal. The image stitching (and some of the photogrammetry) was done using the PIX4D suite for aerial drones. The project aimed to hack existing technology to deliver a simple, usable reef-mapping robot.
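The actual color-correction and reef-cover pipeline isn't reproduced here, but the research question above ("can image-processing algorithms find the reef cover autonomously?") can be illustrated with a simple color-segmentation sketch: apply a crude correction, then report the fraction of pixels falling inside a "coral" color band. The HSV ranges and file name below are illustrative assumptions, not calibrated values from the project.

```python
# Illustrative sketch: estimate reef cover as the fraction of pixels inside a
# colour band, after a simple grey-world colour correction. The HSV ranges
# are placeholders, not calibrated values from the project.
import cv2
import numpy as np

def grey_world_correct(img_bgr):
    """Crude underwater colour correction: scale each channel to a common mean."""
    img = img_bgr.astype(np.float32)
    means = img.reshape(-1, 3).mean(axis=0)
    img *= means.mean() / means
    return np.clip(img, 0, 255).astype(np.uint8)

def cover_fraction(img_bgr, lo=(5, 60, 60), hi=(35, 255, 255)):
    """Fraction of pixels inside an HSV band (placeholder 'coral' colours)."""
    hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask.sum() / 255 / mask.size

if __name__ == "__main__":
    frame = cv2.imread("reef_frame.jpg")          # hypothetical GoPro frame
    corrected = grey_world_correct(frame)
    print(f"Estimated cover: {cover_fraction(corrected):.1%}")
```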

My Role

I founded Mindorobots with TED Senior Fellow Cesar Harada and Rohak Singhal. I organized the whole project from the university side, including funding, logistics, mentorship and recruitment. At Mindoro, I was in charge of hacking the GoPro and connecting it to the PIX4D photogrammetry mapping software to create the maps. I assembled the gimbal fixture for camera stabilization. Further, I built numerous laser quadrats to understand how reef images can be converted from 2D to 3D. I worked on algorithms for pose estimation and robot navigation.
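The laser quadrats aren't documented in detail here, but the basic idea is that parallel lasers with a known physical spacing project dots onto the reef; the pixel distance between those dots in an image gives a metres-per-pixel scale, which is what lets 2D photos be turned into dimensioned reconstructions. Below is a rough sketch of that scale estimation; the colour thresholds, the 50 mm spacing and the file name are assumptions for illustration, not the project's calibration.

```python
# Rough sketch: recover a metres-per-pixel scale from two laser dots of known
# physical spacing in a single reef image. Assumes both dots are visible;
# thresholds and spacing are illustrative, not the project's calibration.
import cv2
import numpy as np

LASER_SPACING_M = 0.05   # assumed 50 mm between the two laser pointers

def find_laser_dots(img_bgr):
    """Return centroids of the two largest bright-red blobs (the laser dots)."""
    hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around hue 0, so combine two ranges (illustrative thresholds).
    m1 = cv2.inRange(hsv, (0, 120, 200), (10, 255, 255))
    m2 = cv2.inRange(hsv, (170, 120, 200), (180, 255, 255))
    mask = cv2.bitwise_or(m1, m2)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    # Skip label 0 (background); keep the two largest blobs.
    order = np.argsort(stats[1:, cv2.CC_STAT_AREA])[::-1] + 1
    return [tuple(centroids[i]) for i in order[:2]]

def metres_per_pixel(img_bgr):
    (x1, y1), (x2, y2) = find_laser_dots(img_bgr)
    pixel_dist = np.hypot(x2 - x1, y2 - y1)
    return LASER_SPACING_M / pixel_dist

if __name__ == "__main__":
    frame = cv2.imread("reef_frame.jpg")           # hypothetical GoPro frame
    print(f"Scale: {metres_per_pixel(frame) * 1000:.2f} mm per pixel")
```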

My work will be presented at RCUK 2018 in London and EI 2019 in San Francisco! We were runners-up at the James Dyson Award, and our work was broadcast on numerous news outlets. See the project site or Facebook for more!

I was mentored by Cesar Harada, Ken Chew, Eddie Yung, Prof. Edmund Lam and Prof. KS Lui and was supported by a team of 12 of the finest engineers at the University of Hong Kong.

Why

I want to build technology that helps people solve their own problems so that we can, in a decentralized way, make the world a better place. I believe automating tasks that humans are inefficient at and reducing the marginal cost of our infrastructure is the key to realizing the next industrial revolution. Inspired by Cesar Harada, I explored vision systems and was captivated by photogrammetry and stereo vision and their incredibly impactful applications in the real world. Cesar inspired the Mindorobots team to think about how local communities can conserve their own reefs. Mindorobots is an amalgamation of these interests and inputs.












The Vayu Project

Skills & Introduction:


The goal of this project is to reinvent the way we explore our oceans.
The Vayu Project is a team effort by students and professors at the bionics and control lab at The University of Hong Kong to build the world's fastest robotic fish and break a Guinness World Record.

Research Questions:

  • Can biomimetic propulsion deliver higher performance than propellers?
  • How can compliant sections improve the efficiency/energy output of the design?
  • Is it possible to engineer a research platform for studying undulatory motion?

Detailed Description:

This project looks to engineer a robot that mimics the highly efficient natural motion of fish and delivers high performance and efficiency. The project aims to break a Guinness World Record shortly by making the Vayu robot the fastest robotic fish ever built: we're trying to be faster than Phelps! The robot is a complex piece of technology and pushes the limits of bio-mimicry and underwater engineering like never before. Vayu in Sanskrit means a swift wind, and Yu in Chinese means fish.

My Role

I founded the Vayu Project with Dr. Zheng Wang at the University of Hong Kong. I build all the electronics and code the radio-control and vision systems that control the robot. I also help design a number of the 3D-printed components of the body shell in SolidWorks. Finally, as the founder of the project, I also lead the project direction and outreach.

I presented my first research paper (and the project's first paper!) on Vayu at PAAMES/AMEC 2018 in Busan. I applied to the Guinness Book of World Records for "Fastest Robotic Fish" when I was in high school (2014) and, after some correspondence, got approval in 2016 to attempt it. As a Year 2 undergraduate I had few resources to start on my own, so I approached professors at my university to see if we could do this together; thus Vayu was born. Do check out our Facebook page!

Why

As a child I always wondered how fish glide so gracefully yet swiftly while motorboats are noisy and lumbering. Since then I've been fascinated by biology and its parallels to engineering. Soft robotics and biomimetics are areas where I wish to make a huge impact with my work: I've been building humanoids, robot arms, prosthetics and soft robots since I was 15. Vayu is my attempt to use biomimetics to reinvent the way we travel our oceans. In July '17 I completed a course on Soft Robotics under Prof. Hongbin Liu (KCL) at Peking University to better understand the dynamics of a robot like Vayu.











NxtBraille

Skills & Introduction:



The goal of this project is to reinvent the relationship that the visually impaired have with technology. At NxtBraille we are developing an open-source, freely distributed, integrated braille reader/keyboard for smartphones. Once our tactile interface is perfected, our goal will be to see whether we can send minimalized images from the smartphone camera to the body through visual-tactile sensory substitution.

Research Questions:

  • How much data can we send through touch/braille interfaces to a visually impaired individual?
  • How can we minimalize the data from our smartphone being sent to the body?
  • Can we reduce the cost of tactile interfaces by changing the underlying technology?

Detailed Description:

Current refreshable braille displays cost upwards of 700 USD. We're replacing the expensive piezo system with a new haptic interface to help cut the cost to under 20 USD. Our idea is to make braille-based smartphone access more convenient and accessible for visually impaired people and help support greater digital literacy in the community. Since the project is open source, anyone can freely add to the project, improve it, replicate it or sell it without any patent royalties. This will significantly reduce the cost and increase the availability of these devices.

The team travelled to X.Factory, Seeed Studio in Shenzhen and met with Josh Grisdale from Accessible Japan, Emi Aizawa and Mark Bookman in Tokyo. Our final goal will be to restore vision completely using the vibro-tactile interface and sensory substitution.

My Role

I founded NxtBraille with Rohak Singhal at the University of Hong Kong. I was entirely responsible for the ideation, design and implementation of the keyboard interface. I also came up with the "Vibraille" concept: a vibration-based haptic interface that allows a person to communicate with their device through their skin using sensory substitution. A team of 8 of the finest engineers at HKU then helped implement this idea in a reader.
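To give a feel for the Vibraille idea, here is a tiny sketch of the encoding side: each character maps to its standard 6-dot braille cell, and the cell is played out as a timed vibration pattern. The motor-driver call and timings are hypothetical placeholders; the real interface hardware and firmware are not shown here.

```python
# Tiny sketch of the "Vibraille" encoding idea: map a character to its 6-dot
# braille cell and play the cell as a timed vibration pattern.
# drive_motor() is a hypothetical stand-in for the real haptic driver.
import time

# Standard braille cells as sets of dot numbers (1-3 left column, 4-6 right).
BRAILLE = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
}

DOT_ON_S = 0.15   # assumed vibration pulse length per raised dot
DOT_OFF_S = 0.10  # assumed gap between dots

def drive_motor(dot, on):
    """Placeholder for the real haptic driver (e.g. one motor per dot)."""
    print(f"dot {dot}: {'ON' if on else 'off'}")

def play_char(ch):
    """Pulse the motors for each raised dot of the character's braille cell."""
    cell = BRAILLE.get(ch.lower(), set())
    for dot in range(1, 7):
        if dot in cell:
            drive_motor(dot, True)
            time.sleep(DOT_ON_S)
            drive_motor(dot, False)
        time.sleep(DOT_OFF_S)

def play_text(text):
    for ch in text:
        play_char(ch)
        time.sleep(3 * DOT_OFF_S)   # inter-character pause

if __name__ == "__main__":
    play_text("abc")
```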

Why

My desire is to build technology that will enable others to fulfill their potential more fully. I wish to reinvent the relationship that we have with our technology. I love experimenting with interfaces, and I often ask myself, "How can we communicate better with our technology?" I love thinking about data-minimalization and how we can abstract data and feed it into the body using different techniques. Inspired by David Eagleman's Sixth Sense vest, I've experimented with haptics, EMGs, etc. My empathy towards the visually impaired community grew while studying how our brains perceive depth and color for another project. I simply cannot fathom a life without color, depth and visual perception. NxtBraille was born of all these interests.










Open Handuino

Skills & Introduction:



The goal of this project is to build technology that will allow others to fully fulfill their potential. The aim is to build a prosthetic hand that allows amputees to non-invasively control it and "feel" back from it (retain sensory input), just like a biological hand.

Research Questions:

  • Can we help amputees regain the sensory perception of having limbs?
  • Can we build an open source product that can be built entirely with off the shelf parts?
  • Can our prosthetics become seamless extensions of our body, just like natural limbs?

Detailed Description:

The goal of this project is to build technology that will allow others to fully fulfill their potential. Although there are numerous companies in the open-source prosthetics space, none are using open hardware and off-the-shelf resources, making users depend on their hardware/software systems to be able to use the designs. We are trying to change that by building an open-source hand that allows users to control as well as feel back through their prosthetic limbs. The goal is to deliver a product that anyone can download for free and 3D print, and whose components can be bought off the shelf from any large online retailer like Amazon or Taobao. We are using open hardware platforms like the Arduino to make this possible. We aim to deliver a design that can be scaled easily to any size and would cost under 100 USD to build yourself.

My Role

This project is my Final Year Project under Prof. Edmund Lam at the University of Hong Kong. Because of the university's course structure, this project is solo!

Why

Part of my passion is a desire to build technology that will allow others to fully fulfill their potential. I was inspired by Hugh Herr's TED talk and I've been working on prosthetics since. I strongly believe in Stephen Hawking's idea of the human experiment being able to transcend its own capabilities. I'm also a strong proponent of the open-source movement. However, I noticed that open prosthetics aren't really open source: they have proprietary electronics. During NxtBraille, I thought about how haptics could also be used to interface data with sighted people. How cool would it be if I could communicate with my phone (read texts, send messages, etc.) just by thinking? "How much data can we interface through our skin?" Handuino is the result of all these ideas.










Bipedal Robot

Skills & Introduction:


I have, from a young age, been driven by a powerful desire to break barriers. This project's goal was for me to break a national record before my 16th birthday. I built the smallest bipedal robot in India.

Research Questions:

  • Can I use feedback and control to make a biped that can balance and walk?

Detailed Description:

The robot is only 5 inches tall and perfectly capable of walking. It can also kick, move down slopes and dance! I learnt the basics of feedback and control loops through this project. More here!
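The robot's actual firmware isn't reproduced here, but the "feedback and control loop" idea it taught me is easy to sketch: measure a tilt error and feed a proportional-derivative correction back to the ankle servos every cycle. The gains and the sensor/servo calls below are hypothetical placeholders, purely for illustration.

```python
# Minimal sketch of a proportional-derivative (PD) balance loop of the kind a
# small biped might run. read_tilt() and set_ankle() are hypothetical
# stand-ins for the real sensor and servo calls; gains are illustrative.
import time
import random

KP, KD = 2.0, 0.4          # illustrative proportional and derivative gains
DT = 0.02                  # 50 Hz control loop

def read_tilt():
    """Placeholder: forward tilt in degrees (here, just noise around 0)."""
    return random.uniform(-1.0, 1.0)

def set_ankle(correction_deg):
    """Placeholder for commanding the ankle servos."""
    print(f"ankle correction: {correction_deg:+.2f} deg")

def balance_loop(cycles=100):
    prev_error = 0.0
    for _ in range(cycles):
        error = 0.0 - read_tilt()               # we want zero tilt
        derivative = (error - prev_error) / DT
        set_ankle(KP * error + KD * derivative)
        prev_error = error
        time.sleep(DT)

if __name__ == "__main__":
    balance_loop()
```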

My Role

This project was solo work!
I was awarded a national record by the Limca Book of Records in India. My work was featured on numerous Indian news channels and papers.

Why

My parents run a small manufacturing unit outside Bangalore, India, and when I was 13 my dad engineered a CNC machine to mill a certain component they were manufacturing. I was amazed. This was my first real exposure to practical "robotics", and around this time I also came across the Honda ASIMO: pre-Atlas, this was the fanciest humanoid robot in the world. I was learning C++ in school and had just bought myself an Arduino and a servo motor with my pocket money. That's when I decided I was going to build a small ASIMO with what I had (I couldn't afford larger servos anyway). That's how the biped came to existence!










Robotic Arm

Skills & Introduction:


This robotic arm is 6 inches tall and is complete with a multipurpose gripper. I built it when I was 17.

Research Questions:

  • Can I model the position of the end-effector of a robotic arm?
  • Can I make an equation to find the torque at each joint of the arm?

Detailed Description:

This robotic arm is 6 inches tall and is complete with a multipurpose gripper. I built it when I was 17. The idea was to learn how a robotic arm works, and I taught myself a lot of theory about robot kinematics through this project. This would later become my second national record with the Limca Book of Records.
More here
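The project's two research questions (end-effector position and joint torque) can be answered on paper for a simple planar arm: the end-effector position follows from chaining the joint angles (forward kinematics), and the static torque at each joint is the sum of each downstream link's weight times its horizontal lever arm about that joint. Below is a small sketch of both calculations for a 2-link planar arm; the link lengths and masses are made-up values for illustration, not the arm's real parameters.

```python
# Sketch: forward kinematics and static gravity torques for a 2-link planar
# arm. Link lengths and masses are made-up values for illustration only.
import numpy as np

L1, L2 = 0.08, 0.07          # link lengths in metres (illustrative)
M1, M2 = 0.05, 0.04          # link masses in kg (illustrative)
G = 9.81

def forward_kinematics(q1, q2):
    """Joint angles (rad) -> elbow and end-effector positions (x, z)."""
    elbow = np.array([L1 * np.cos(q1), L1 * np.sin(q1)])
    tip = elbow + np.array([L2 * np.cos(q1 + q2), L2 * np.sin(q1 + q2)])
    return elbow, tip

def static_torques(q1, q2):
    """Gravity torque each joint must hold, assuming link CoMs at mid-link."""
    elbow, _ = forward_kinematics(q1, q2)
    com1 = 0.5 * np.array([L1 * np.cos(q1), L1 * np.sin(q1)])
    com2 = elbow + 0.5 * np.array([L2 * np.cos(q1 + q2), L2 * np.sin(q1 + q2)])
    # Torque = sum of (mass * g * horizontal lever arm about the joint).
    tau1 = G * (M1 * com1[0] + M2 * com2[0])
    tau2 = G * M2 * (com2[0] - elbow[0])
    return tau1, tau2

if __name__ == "__main__":
    q1, q2 = np.radians(30), np.radians(45)
    _, tip = forward_kinematics(q1, q2)
    t1, t2 = static_torques(q1, q2)
    print(f"end-effector at {tip.round(3)} m, torques {t1:.4f}, {t2:.4f} N*m")
```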

My Role

This project was solo work!
I was awarded a national record by the Limca Book of Records in India. My work was featured on numerous Indian news channels and papers.

Why

My parents run a small manufacturing unit outside Bangalore, India. Their office is on the first floor and you can see the shop floor below. For as long as I've visited their plant (and even today), I've been irked by the task of the operator of the 500-ton press. It's the monotony of the job that gets to me: put a part in, press a button, take it out, and repeat (for 8 hours a day, 6 days a week). Ever since I started tinkering with servos and my Arduino, I felt my goal should be to automate the factory (specifically the 500-ton press) using what I knew. My dad told me a FANUC arm costs ~18,000 USD, while my Arduinos and servos cost 60 USD. Hence the quest to build a robotic arm. Although I didn't automate the factory, I taught myself a lot about robot kinematics. Later this would drive me to study soft robots at Peking University, and I now look forward to bringing "5 Minute Automation" into the world!











Fire iVacuation System

Skills & Introduction:


A smart IoT-based fire evacuation system that monitors fire data to display an ideal escape route in real time.

Research Questions:

  • Can I develop an algorithm that determines the best escape path for a given data constraint?

Detailed Description:

A huge number of lives are lost every year to fire. Studies show that a major cause of death during fires is the inability to escape due to route blockage. I built a smart IoT-based fire evacuation system that monitors fire temperature, CO2 levels, etc., and displays an escape route in real time to help unfamiliar evacuees escape safely. Further, it maps the fire for the fire department in real time so that they can optimize their attack strategy.
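The routing can be framed as a shortest-path problem on the building's corridor graph, where each segment's cost grows with measured temperature and CO2 and becomes impassable past a threshold. Here is a minimal sketch of that idea using Dijkstra's algorithm; the graph, readings and thresholds are made-up examples, not data from the deployed system.

```python
# Minimal sketch: escape routing as Dijkstra's shortest path on a corridor
# graph whose edge costs grow with sensor readings. The graph, readings and
# thresholds are made-up examples, not data from the deployed system.
import heapq

def edge_cost(length_m, temp_c, co2_ppm, max_temp=70, max_co2=5000):
    """Return a hazard-weighted cost, or None if the segment is impassable."""
    if temp_c >= max_temp or co2_ppm >= max_co2:
        return None
    return length_m * (1 + temp_c / max_temp + co2_ppm / max_co2)

def best_route(graph, start, exits):
    """Dijkstra from start to the cheapest reachable exit."""
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if d > dist.get(node, float("inf")):
            continue
        for nbr, cost in graph.get(node, []):
            if cost is None:
                continue                       # blocked segment
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(pq, (nd, nbr))
    exit_node = min((e for e in exits if e in dist), key=dist.get, default=None)
    if exit_node is None:
        return None
    path = [exit_node]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path))

if __name__ == "__main__":
    # Corridor graph: node -> [(neighbour, cost), ...] with made-up readings.
    g = {
        "room":   [("hall", edge_cost(10, 30, 800))],
        "hall":   [("stairA", edge_cost(15, 90, 2000)),   # too hot: blocked
                   ("stairB", edge_cost(25, 35, 1000))],
        "stairA": [("exit1", edge_cost(5, 40, 900))],
        "stairB": [("exit2", edge_cost(8, 32, 850))],
    }
    print(best_route(g, "room", {"exit1", "exit2"}))
```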

My Role

This project was solo work!

Why

My quest is to leverage technology to solve problems and overcome challenges for humanity. Fires take countless lives each year. I decided I was going to try and make a difference!






3D Box Scanner

Skills & Introduction:


A Kinect based 3D box scanner - just rotate your box in front of the camera to generate the 3D model!

Research Questions:

  • How can we 3D scan an object by holding and rotating it in front of a camera?

Detailed Description:

I love the concept of hacking existing platforms (smartphone cameras, sensors, etc.) to create new technology solutions. It appeals to the jugaadu Indian in me, and I feel it's the most powerful way of empowering people to access new technology.

This project uses the popular Xbox Kinect to 3D scan boxes! Just show all the sides of the box (roll it around) to the camera and the code will return a 3D point cloud of the scanned box!
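The full scanning pipeline (segmenting the box, registering views as it rotates, fitting the faces) isn't shown here, but the first step, turning one Kinect depth frame into a 3D point cloud, is a one-liner once you know the camera intrinsics. Below is a small sketch; the intrinsics are typical Kinect v1 values used for illustration, not the calibration from the coursework project.

```python
# Sketch: back-project one Kinect depth frame into a 3D point cloud.
# The intrinsics below are typical Kinect v1 values used for illustration,
# not the calibration from the actual coursework project.
import numpy as np

FX, FY = 585.0, 585.0        # approximate focal lengths in pixels
CX, CY = 320.0, 240.0        # principal point for a 640x480 depth image

def depth_to_point_cloud(depth_m):
    """depth_m: (H, W) array of depths in metres; returns (N, 3) points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth_m > 0                      # Kinect reports 0 for no return
    z = depth_m[valid]
    x = (u[valid] - CX) * z / FX
    y = (v[valid] - CY) * z / FY
    return np.column_stack((x, y, z))

if __name__ == "__main__":
    # Fake a flat "wall" 1 m away just to show the call; a real run would read
    # successive depth frames while the box is rotated in front of the camera.
    depth = np.full((480, 640), 1.0)
    cloud = depth_to_point_cloud(depth)
    print(cloud.shape, cloud[:3])
```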

My Role

This project was solo work! Code is up on GitHub.

This project was submitted as my coursework project for the MSc Advanced Vision course at the University of Edinburgh supervised by Prof. Robert Fisher.

Why

I'm amazed by how our eyes help us perceive distance through stereo vision. I also love my Xbox and its Kinect system. I decided to hack the two together to make a "3D scanner" that can scan boxes for an Advanced Vision course assignment at the University of Edinburgh!





CodeXpress Myanmar

We built 6 self-sustaining open-source IoT devices to monitor agricultural data in Myanmar. Myanmar is leapfrogging many technologies due to its entirely new infrastructure. We built these devices at HKU and, with the support of Dagon University in Yangon, deployed them in the field to test them. Later our work was presented at Myanmar's largest startup ecosystem hub, Phandeeyar.

My Role

I was the head of hardware development on this team. Check out the hardware GitHub for more.
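The firmware and schematics live on the hardware GitHub; purely to illustrate the shape of such a device's job, here is a tiny telemetry sketch that reads a sensor and reports it periodically. The sensor read and uplink are hypothetical placeholders, not the deployed devices' actual probe or transport.

```python
# Illustrative telemetry loop for a field sensor node: read, package, send,
# sleep. read_soil_moisture() and send_reading() are hypothetical placeholders
# standing in for the real probe and uplink on the deployed devices.
import json
import random
import time

def read_soil_moisture():
    """Placeholder for the real ADC read from a soil-moisture probe."""
    return round(random.uniform(20.0, 60.0), 1)   # percent

def send_reading(payload):
    """Placeholder uplink: a real node would push this over MQTT/HTTP/GSM."""
    print("uplink:", payload)

def main(period_s=600):
    while True:
        payload = json.dumps({"moisture_pct": read_soil_moisture(),
                              "ts": int(time.time())})
        send_reading(payload)
        time.sleep(period_s)     # e.g. one reading every 10 minutes

if __name__ == "__main__":
    main(period_s=5)             # short period just for demonstration
```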

Why

I love how new technologies disrupt industries, and to me Myanmar is a fascinating country. They have the cheapest 4G in the world. Everyone (even the lowest income tier) owns a smartphone. Most of the country uses cash or mobile payments; most people do not have bank accounts. No email IDs, only Facebook. It's an environment like no other, and the opportunity to be part of the new technology in this country was one I could not refuse!






OSREI

The Mindorobots Project

The Vayu Project

NextBraille

Handuino

Bipedal Robot

Robotic Arm

Fire iVacuation

PC Box Extraction

Smart Home (IDP)

Compliant Robot Simulation

GPU Mapping Realtime

Shirt Folder

PCB Manufacture: NE555 Organ

7 Stage Rube Goldberg Machine

Fish Feeder

MTR Madness!

Sentry Gun

Logo/UX Design

CodeXpress Myanmar

The White Room (Utrecht)

Trapped Documentary (2017)

Model United Nations

Music Composition

Portfolio


"I'm driven by an ambition to empower people to break barriers and transcend their limitations so that they can realize their full potential."

"My quest is to leverage technology to solve problems and overcome challenges for humanity. I hope to build technology that empowers people develop solutions to their own problems so that we can in a decentralized way, make the world a better place."

I am fascinated by the relationship between technology and nature, and by its modelling, interfacing and analysis to solve problems. I love the synergy of hardware and software, of innovation and the economy. I chase connections between different areas of study, the transdisciplinarity and translation of ideas from one domain to another, to think out of the box.

I hope my work will impact the Environment -> People -> Technology -> Economy.

I broke a national record at the age of 16 for building India’s smallest bipedal robot, spoke at a TEDx event in Hong Kong in 2017, and am a Mensa member.

I’m on the dean’s list at HKU and I have presented/will present papers at PAAMES/AMEC 2018, RCUK 2018 and at EI 2019 IRIACV.

I've won awards including the James Dyson Award, the 81' Inclusion Fund, the Gallant Ho Experiential Learning Fund, the Government of India Top 1% Award, the SY King Prize for Academic Excellence, the HKSAR Government Reaching Out Award and the Arthur and Louise May Award, among others.

I’ve spoken at TEDxCityUHongKong, the EARS@UoE and at the Own Future Fair and studied at the University of Edinburgh, Peking University and Utrecht University through exchange programs.

I use and create technology to disruptively solve problems via a transdisciplinary, design thinking approach. A list of my major projects can be found above.

I believe creativity is driven by diversifying experiences. I’ve worked on Marxist alienation, coral reef bleaching, flipped education and neuroscience, and business consulting (via Eureka Consulting Group, HKU); I am a Trinity College London certified Grade 7 pianist and have travelled to 25 countries in a quest to understand cultures and represent my own.

Near-Future: [Jan-May 2019] I shall be working with Alessandro Ponzo to build open-source photogrammetry suites and BRUVs. I shall be working on Clearbot in Bali, a biomimetic ocean-plastic-cleaning device, and at Phandeeyar in Myanmar to consult for startups.





Influencer. Maker. Nomad. Blithering Idiot.

Sid likes to ask, "When do people stop having fun in life?"

When Sid isn't working, he's working on ticking off items on his osumposum bucket list.

Sid has a multi-layered vision-why-quest definition. Check out the dynamic doc here.

Sid tries to improve by 1% everyday by learning and applying ideas. He maps his knowledge and learning here.
Updated scans of his ideas book will be put up here soon!

A timeline of Sid's life can be found by stalking his Facebook page.

Near Future: Currently I'm working on writing a book about robotics for kids, starting a YouTube channel, and building new GIS- and time-based personal network mapping software (think personal visual-social mapping) to help me build valuable, long-lasting relationships.

Here's a small map of where Sid has lived and travelled!

Media


TEDxCityUHongKong 2017 | How To Be A Time Lord

Chronologically Arranged Newspaper Articles

Smallest Bipedal Robot India (2013)

Smallest Robotic Arm India (2015)

Harvard Model United Nations (2014)

Indian Board Topper (2015)

IoT Hackathon by ITC (2015)

Mindorobots (2018)

TV: Smallest Biped India '13

TV: Model United Nations '14

TV: Smallest RobotArm '15

TV: Mindorobots Philippines '18



Sidhant Gupta
304, 5th Main, 1st Block,
Koramangala, Bangalore - 560034,
Karnataka, India.
(+91) 9902801600
(+852) 51385925

Reach me at sdhnt@connect.hku.hk