Diagram sketch of our prototype, drawn by Arya Rahnama; images are CC BY 2.0

 

Above is an illustration of a prototype of our solution. Five “perspectives” of the device are provided: a front view, an aerial view of the top, a side profile, the device held in a standard grip, and a view of the internals. The electronics housing will be made of a sturdy plastic, and there will be two circuit components in addition to the Arduino. The first circuit processes the data from the sensors and sends it to the Arduino and the second circuit. The second circuit interacts with the main mechanical tube and actuates the pins; pin actuation enables the user to “see” a low-resolution image of their surroundings through their sense of touch. The Arduino drives the speaker on the device, allowing audio to be played, and also enables wireless communication over Bluetooth with other devices such as cell phones.
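To make the sensor-to-pin data flow concrete, below is a minimal sketch of how the second circuit's job could work in software: a Lidar depth map is downsampled into a low-resolution grid of pin heights, with nearer obstacles producing more extended pins. The grid size, sensor range, and pin travel used here are illustrative assumptions, not specifications of our prototype.

```python
# Hypothetical constants -- the real device's values are not yet fixed.
PIN_GRID = 8          # assumed 8x8 pin array
MAX_DEPTH_MM = 4000   # assumed sensor range; objects beyond 4 m read as "far"
PIN_TRAVEL_MM = 10    # assumed maximum pin extension

def depth_to_pin_heights(depth_map):
    """Average each block of the depth map into one pin cell, then map
    nearer obstacles to taller pins (0 mm = fully retracted)."""
    rows, cols = len(depth_map), len(depth_map[0])
    bh, bw = rows // PIN_GRID, cols // PIN_GRID
    heights = []
    for r in range(PIN_GRID):
        row = []
        for c in range(PIN_GRID):
            # Collect the depth readings covered by this pin.
            block = [depth_map[r * bh + i][c * bw + j]
                     for i in range(bh) for j in range(bw)]
            avg = sum(block) / len(block)
            # Clamp to sensor range, then invert: closer -> more extension.
            avg = min(avg, MAX_DEPTH_MM)
            row.append(round(PIN_TRAVEL_MM * (1 - avg / MAX_DEPTH_MM), 2))
        heights.append(row)
    return heights

# Example: a 16x16 scene that is entirely far away yields fully retracted pins.
far_scene = [[MAX_DEPTH_MM] * 16 for _ in range(16)]
print(depth_to_pin_heights(far_scene)[0][0])  # -> 0.0
```

In the actual device this mapping would run on the first circuit or the Arduino, with the resulting heights sent to the actuators in the mechanical tube.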

 

Proof of concept of (similar) prototype, © 2020 Shane Wighton

 

In this clip from Shane Wighton’s YouTube channel, he demonstrates a test of a similar device. Shane’s device uses an iPad to collect the Lidar data used to generate an image of his surroundings, connected to a pin cylinder that he created. In the illustration above, one can see that there is no iPad; instead, a custom-made casing houses a sensor that collects the Lidar data, and the pin setup in the diagram is analogous to Shane’s. While these two solutions have similar functionality and address the issue to a similar degree, ours has the advantage of being much cheaper than an iPad with an additional device attached. The lower cost makes the solution far more accessible, and the casing is also designed to be more resistant to falls than an iPad, owing to the difference in material and the lack of a screen. Because the two devices are so similar, this clip serves as an adequate proof of concept that an individual with impaired vision can use such a device to navigate their surroundings.


8 thoughts on “Device Illustration”

  1. If you are able, increase the size of the diagram; the text and shapes are just too small. Cite the name of the person who drew the diagram. It is unclear if you borrowed someone else’s design, since you are demonstrating someone else’s similar design below, in the video.

    Reply
    • I tried to increase the size of both the diagram and video to make them more clear. I was the one who drew the diagram, so I have added my name under the diagram.

      Reply
  2. The diagram makes the device look rather large and cumbersome and the video shows the tester moving rather slowly. How effective would you expect this device to be compared to a simple cane?

    Reply
    • We mentioned that the device may require some training before one is fully comfortable using it, and in the clip we showed there was no such training beforehand. Additionally, a simple cane is best at identifying obstructions close to the ground; it could not, for example, warn someone about a hanging branch they are about to walk into. Our device can also gauge the distance to objects in front of the user, so they can be aware of obstacles that are too far away to touch with a cane.

      Reply
  3. How effective would this device be in areas with uneven surfaces? Could the sensor detect something like a bush from far away enough to alert the user? What about obstacles that are close to the ground?

    Reply
  4. Now that I can clearly see the diagram, I have more questions…hah!
    I would perhaps break up the diagram into 3 separate images with their own paragraphs to avoid confusion.

    1. The grip, top, and side views in one image, to help readers understand the ‘device’s footprint/profile’. And, instead of FOV use Field of View. This ‘internationalizes’ it more than using acronyms that might be hard to determine.

    2. The pins and sensors with a paragraph that describes each sensor…or perhaps list each sensor’s name with its purpose.

    3. Electronics, CPU, Arduino with its own paragraph that describes each component and what it controls.

    4. The software, with its own paragraph noting the use of the speaker and haptic feedback.

    Reply
  5. Did you consult any blind students, faculty, or acquaintances with your idea, and what did they say about the possibilities?

    Reply
  6. It would be interesting to see this idea implemented with a cane/walking stick, which is a tool that many blind people are probably already accustomed to. The cane serves its own purpose while this technology simply supplements it.

    Reply
