Massively parallel real-time network architectures for robots ca

Details

Classification: 364/413.02, G06F 15/20, G06F 15/42
Type: Patent
Status: active
Number: 048520187

ABSTRACT:
A real-time network enables robots to accurately learn sensory-motor transformations and to self-train and self-calibrate operating parameters after accidents or with wear. Combinations of visual and present-position signals are used to relearn a target position map. Target positions in body-centered, visually activated coordinates are mapped into target positions in motor coordinates, which are compared with present positions in motor coordinates to generate motor commands. Feedback provides calibrated error signals for adjusting the learned gains as the system changes with aging, accidents, and the like. A series of prestored motor commands may be executed later upon a "go" command.
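As a rough illustration of the scheme described above, the following Python sketch learns a target position map (visual coordinates to motor coordinates) by delta-rule self-calibration during random movements, issues difference-vector motor commands, and recalibrates after the simulated plant drifts. The plant matrix, the babbling procedure, and all names and parameters here are illustrative assumptions, not the circuitry claimed in the patent.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward plant (unknown to the learner): maps motor
# coordinates to the limb position the visual system reports in
# body-centered coordinates. Stands in for arm kinematics plus vision.
plant = rng.normal(size=(2, 3))

# Learned target position map: visual coordinates -> motor coordinates.
W = np.zeros((3, 2))
LR = 0.02  # assumed learning rate

def babble_and_calibrate(W, steps=5000):
    """Self-calibration: issue random motor commands, observe where the
    limb lands visually, and adjust W so that W @ observed_position
    reproduces the command that caused it (delta rule)."""
    for _ in range(steps):
        m = rng.normal(size=3)            # random exploratory command
        v = plant @ m                     # visually observed outcome
        W += LR * np.outer(m - W @ v, v)  # calibrated error drives learning
    return W

def reach(target_visual, present_motor, W):
    """Map a visual target into motor coordinates and return the motor
    command as the difference from the present position."""
    return W @ target_visual - present_motor

W = babble_and_calibrate(W)
target = np.array([0.5, -0.2])
cmd = reach(target, np.zeros(3), W)  # present position at the origin
print("error after calibration:", np.linalg.norm(target - plant @ cmd))

# Simulated wear: the plant drifts, the old gains become miscalibrated,
# and another self-calibration pass restores accuracy.
plant += 0.3 * rng.normal(size=plant.shape)
print("error after wear:",
      np.linalg.norm(target - plant @ reach(target, np.zeros(3), W)))
W = babble_and_calibrate(W)
print("error after recalibration:",
      np.linalg.norm(target - plant @ reach(target, np.zeros(3), W)))

The difference-vector step in reach mirrors the abstract's comparison of target positions with present positions in motor coordinates, and the babbling pass stands in for the feedback that supplies calibrated error signals.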

REFERENCES:
patent: 3950733 (1976-04-01), Cooper et al.
patent: 4044243 (1977-08-01), Cooper et al.
patent: 4254474 (1981-03-01), Cooper et al.
patent: 4319331 (1982-03-01), Elbaum et al.
patent: 4326259 (1982-04-01), Cooper et al.
patent: 4450530 (1984-05-01), Llinas et al.
patent: 4648052 (1987-03-01), Friedman et al.
patent: 4676611 (1987-06-01), Nelson et al.

Profile ID: LFUS-PAI-O-2363475
