
The Aamba Project

An Android App for visually challenged people. The app guides people to walk with voice feedback or vibration feedback.

Motivation

According to the World Health Organization, an estimated 285 million people of all ages are visually impaired worldwide, of whom 39 million are blind; people aged 50 and older account for 82% of all blind people. So I came up with the idea of an app that helps blind people walk safely indoors as well as outdoors. The app can recognize household items and humans. The idea is to detect such objects with the smartphone camera and report which part of the screen (quadrant, 3x3 or 2x4 grid) the object is in, so the person can take appropriate action.

Technology

The app uses TensorFlow Lite as its backend. The machine learning model is MobileNet SSD, trained on the well-known COCO dataset. You do not have to download the pretrained model yourself; the Gradle script handles it for you.
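
As a rough illustration of how the TensorFlow Lite backend is wired up, the sketch below memory-maps the bundled model and hands it to an Interpreter. This is a minimal sketch, assuming the model is shipped in the APK assets; the asset name detect.tflite is an assumption, not necessarily the file name used in this repository.

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Minimal sketch: memory-map the model shipped in the APK assets and
// hand it to the TensorFlow Lite Interpreter. The asset name is assumed.
fun createInterpreter(context: Context): Interpreter {
    val fd = context.assets.openFd("detect.tflite") // hypothetical asset name
    val channel = FileInputStream(fd.fileDescriptor).channel
    val model: MappedByteBuffer =
        channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
    return Interpreter(model)
}
```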


Plan

We will be adding a simple Android app with a TensorFlow Lite backend. This will serve as the backbone. Task list:

  • Basic app
  • Add a vertical line and output left or right
  • Add a horizontal line and output the quadrant, e.g. human (object name) + in + left bottom (quadrant location), or "dog in right lower corner" (a sketch of this mapping follows the list)
  • Add an option to increase the number of lines/blocks.
    • (3 x 3)
    • (2 x 4)
  • Add a haptic feedback mode (needs brainstorming on how it can be useful: do we need an external device or the smartphone's vibration motor?)
  • Integrate other features into this module
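
To make the quadrant idea above concrete, here is a hedged sketch of the grid-mapping step: place the detection's normalised centre point into an N x M grid and turn the cell into a spoken phrase. Function names and labels are illustrative, not the app's actual API.

```kotlin
// Sketch of the grid-mapping step from the task list above. Coordinates
// are normalised to [0, 1]; names and labels are illustrative only.
fun cellFor(cx: Float, cy: Float, rows: Int, cols: Int): Pair<Int, Int> {
    val col = (cx * cols).toInt().coerceIn(0, cols - 1)
    val row = (cy * rows).toInt().coerceIn(0, rows - 1)
    return row to col
}

// Builds a phrase like "human in left bottom" for a simple 2 x 2 grid.
fun describe(objectName: String, cx: Float, cy: Float): String {
    val (row, col) = cellFor(cx, cy, rows = 2, cols = 2)
    val horizontal = if (col == 0) "left" else "right"
    val vertical = if (row == 0) "top" else "bottom"
    return "$objectName in $horizontal $vertical"
}
```

Raising rows and cols to 3 x 3 or 2 x 4 covers the finer grids listed above; the resulting phrase can then be spoken or mapped to a vibration pattern.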

Download the app for your Android device

https://github.com/aamba/Aamba/tree/master/Data/ReadyToUse-APK

How to build it yourself

  • Clone this repository to your local device.
  • Make sure you have the latest version of Android Studio.
  • Open Android Studio and, from the Welcome screen, select Open an existing Android Studio project, then select the cloned folder.
  • Let Android Studio download its Gradle dependencies if this is your first time running the app in Studio.
  • (If an error occurs, rebuild using Build > Rebuild Project.)
  • Make sure your Android device is connected.

Model used

MobileNet SSD

http://storage.googleapis.com/download.tensorflow.org/models/tflite/coco_ssd_mobilenet_v1_1.0_quant_2018_06_29.zip
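
For reference, this quantized COCO SSD MobileNet v1 model takes a 300 x 300 RGB image as uint8 input and returns bounding boxes, class indices, scores and a detection count. The sketch below shows one plausible way to read those outputs; the 10-detection output size matches the published model, but treat the code as an assumption-laden example rather than the app's actual implementation.

```kotlin
import org.tensorflow.lite.Interpreter
import java.nio.ByteBuffer

// Sketch: run the quantized SSD model on a 1x300x300x3 uint8 image buffer
// and read its four output tensors (boxes, classes, scores, count).
fun runDetection(interpreter: Interpreter, image: ByteBuffer): List<String> {
    val boxes = Array(1) { Array(10) { FloatArray(4) } } // [ymin, xmin, ymax, xmax], normalised
    val classes = Array(1) { FloatArray(10) }            // COCO label indices
    val scores = Array(1) { FloatArray(10) }             // confidence per detection
    val count = FloatArray(1)                            // number of valid detections
    val outputs = mapOf<Int, Any>(0 to boxes, 1 to classes, 2 to scores, 3 to count)

    interpreter.runForMultipleInputsOutputs(arrayOf<Any>(image), outputs)

    // Keep only reasonably confident detections and report them as text.
    return (0 until minOf(count[0].toInt(), 10))
        .filter { scores[0][it] > 0.5f }
        .map { "class ${classes[0][it].toInt()} at ${boxes[0][it].joinToString()}" }
}
```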

Documentation

Coming soon. :) You can help with it.

Join the Aamba Community

Click here to join Gitter

Contributors

All contributors are welcome. Create GitHub issues and suggest new features.

  • Sunny Dhoke (@sunn-e)
  • 'Please add your name here after the asterisk'
