Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wu Chih-Jen | en_US |
dc.contributor.author | Tsai Shung-Yung | en_US |
dc.contributor.author | Tsai Wen-Hsiang | en_US |
dc.date.accessioned | 2014-12-16T06:14:16Z | - |
dc.date.available | 2014-12-16T06:14:16Z | - |
dc.date.issued | 2012-02-14 | en_US |
dc.identifier.govdoc | G01C022/00 | zh_TW |
dc.identifier.govdoc | G05B019/04 | zh_TW |
dc.identifier.govdoc | G05B019/18 | zh_TW |
dc.identifier.govdoc | G06F019/00 | zh_TW |
dc.identifier.govdoc | G06G007/70 | zh_TW |
dc.identifier.govdoc | G01C021/00 | zh_TW |
dc.identifier.govdoc | G01C021/34 | zh_TW |
dc.identifier.govdoc | G08G001/123 | zh_TW |
dc.identifier.govdoc | G05D001/00 | zh_TW |
dc.identifier.uri | http://hdl.handle.net/11536/104607 | - |
dc.description.abstract | The present invention discloses an automatic ultrasonic and computer-vision navigation device and a method using the same. In the method of the present invention, the user first guides the automatic navigation device to learn and plan a navigation path; next, the automatic navigation device navigates independently, using ultrasonic signals and computer vision to detect the environment; then, the automatic navigation device compares the detected environment data with the planned navigation path to correct its physical movement track. The present invention enables ordinary persons to interact with the automatic navigation device without operating a computer. As the present invention adopts computer vision and ultrasonic signals to realize its functions, manufacturers can reduce hardware costs. | zh_TW |
dc.language.iso | zh_TW | en_US |
dc.title | Automatic ultrasonic and computer-vision navigation device and method using the same | zh_TW |
dc.type | Patents | en_US |
dc.citation.patentcountry | USA | zh_TW |
dc.citation.patentnumber | 08116928 | zh_TW |
Appears in Collections: | Patents |
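
The abstract above describes a learn-then-navigate scheme: a path is first learned while the user guides the device, and the device then re-traverses that path autonomously, comparing ultrasonic and vision readings against the planned path to correct its movement track. The following is a minimal, hypothetical Python sketch of that loop; the names (`Pose`, `learn_path`, `sense_pose`, `correct_track`, `navigate`) and the simple drift model are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch of the navigation scheme summarized in the abstract:
# learn a path by guiding, then follow it while correcting drift detected
# by (stubbed) ultrasonic/vision sensing.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Pose:
    x: float
    y: float


def learn_path(guided_poses: List[Tuple[float, float]]) -> List[Pose]:
    """Learning phase: record the poses visited while the user guides the device."""
    return [Pose(x, y) for x, y in guided_poses]


def sense_pose(planned: Pose) -> Pose:
    """Placeholder for pose estimation fused from ultrasonic and vision data.
    Here it simply returns a slightly drifted pose to exercise the correction step."""
    return Pose(planned.x + 0.1, planned.y - 0.05)


def correct_track(planned: Pose, sensed: Pose) -> Tuple[float, float]:
    """Compare the sensed pose with the planned waypoint and return a correction."""
    return planned.x - sensed.x, planned.y - sensed.y


def navigate(path: List[Pose]) -> None:
    """Autonomous phase: follow the learned path, correcting drift at each waypoint."""
    for waypoint in path:
        sensed = sense_pose(waypoint)
        dx, dy = correct_track(waypoint, sensed)
        print(f"waypoint ({waypoint.x:.2f}, {waypoint.y:.2f}) "
              f"-> correction ({dx:+.2f}, {dy:+.2f})")


if __name__ == "__main__":
    learned = learn_path([(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)])
    navigate(learned)
```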