Nuclear & Applied Robotics Group
The Nuclear and Applied Robotics Group is an interdisciplinary research group at The University of Texas at Austin. Our mission: develop and deploy advanced robotics in hazardous environments to minimize risk to human operators. For more detail, please visit our official website.
Comments
Can you use Vision Pro?
Hemisphere of something spicy?
Is there any source code that could reproduce the robot arm behavior shown in the video above?
Hanzhen harmonic drive gear, strain wave reducer, robot joint, over 30 years experience
Thanks so much for an excellent video!
The best video I have seen in a while!
Very interested in your work.
Thank you for your video. What is the name of the device shown at timestamp 1:02?
Is there any contact address where I can reach you?
Nice demo! Is the code available in a public repository?
good work.
Nice project and good job! Can I ask if the 90° wheel-turn system (at 1:30) is a standard part purchased somewhere, or something that you made for this project? If it is a standard part, can you please provide me with info / a link to the dealer? :-) Thank you!
I have the same question as Zhu: is there an API that could get the joint torque? Any suggestion would be appreciated.
Are you using ur_modern_driver? How do you get joint torque from the robot?
Hi, did you find out how to do it?
@@vladislavantipov7484 Actually, there seems to be no way to get the joint torque from the currently supported driver. I guess the admittance control in this video is based on a wrist force/torque sensor, with IK velocity commands sent to each joint.
@@Alexander-jz1cp Try Universal Robots' RTDE interface.
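As the thread above suggests, this kind of behavior can be approximated with admittance control driven by a wrist force/torque sensor rather than joint-torque readings. A minimal, ROS-free sketch of the idea follows; the gains, deadband, and function name are illustrative assumptions, not taken from the video's actual code:

```python
# Minimal damping-style admittance control sketch (illustrative only):
# the commanded Cartesian velocity is proportional to the measured wrench,
# with a deadband so sensor noise does not move the arm.

def admittance_velocity(wrench, damping=50.0, deadband=2.0):
    """Map a 6-DOF wrench [Fx, Fy, Fz, Tx, Ty, Tz] to a Cartesian
    velocity command v = F / d, ignoring components below the deadband."""
    vel = []
    for f in wrench:
        if abs(f) < deadband:
            vel.append(0.0)          # treat small readings as sensor noise
        else:
            vel.append(f / damping)  # pure damping law: v = F / d
    return vel

# Example: a 10 N push along x; the other components are below the deadband.
print(admittance_velocity([10.0, 0.5, -1.0, 0.0, 0.0, 0.0]))
# -> [0.2, 0.0, 0.0, 0.0, 0.0, 0.0]
```

The resulting Cartesian velocity would then be converted to joint velocities (e.g. via the inverse Jacobian) and streamed to the robot driver.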
This was too slow. You should work on making it a lot faster.
Hi Sir, how is the robot controlled? By a joystick?
Hi Steven, our favorite way is a SpaceNavigator Pro, which is like a joystick and only requires one hand. You can use an Xbox controller, too, or a LeapMotion controller. The LeapMotion controller is good for beginners but annoying for more difficult tasks.
Hi sir, I am planning to control a UR5 robot arm with a Leap Motion or Kinect non-contact sensor in real time in ROS. I am an absolute beginner. May I please ask if I can use your package to develop it? I am now able to control the UR5 arm very well in RViz and Gazebo by doing motion planning with MoveIt (by following this tutorial: wiki.ros.org/ur_gazebo). However, what I want to do is real-time teleoperation, not first planning the trajectory and then executing it. Can you please tell me how I can do real-time control of the UR5 arm in ROS? Secondly, I've noticed that in RViz you can simply drag the tool center point of the UR5 to do the motion planning before executing. Can I control only the tool center point during teleoperation to control the whole arm? I only care about the trajectory of the tool center point. Or am I going to have to map all the joints of the robot to control it in real time? Thanks a lot.
Please feel free to use the package! You will only have to worry about the tool center point -- the package will handle the math for the robot joints. I'm linking to your GitHub issue in case others have a similar question: github.com/UTNuclearRoboticsPublic/jog_arm/issues/60
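For real-time teleoperation like this, the usual pattern is to turn the input device's axes into a Cartesian twist command for the tool center point and let a jogging package convert it to joint motion. Below is a minimal, ROS-free sketch of that device-to-twist mapping; the axis ordering, scale factors, and function name are illustrative assumptions (in a real ROS node these values would fill a geometry_msgs/TwistStamped message published on the jogging package's command topic):

```python
# Sketch of mapping raw joystick / SpaceNavigator axes to a Cartesian twist.
# Axes are assumed to be six values in [-1, 1], ordered
# [x, y, z, roll, pitch, yaw]; scales convert them to velocity commands.

def axes_to_twist(axes, linear_scale=0.1, angular_scale=0.5):
    """Scale raw device axes into a twist-like dict with 'linear'
    (m/s) and 'angular' (rad/s) velocity components."""
    lin = [a * linear_scale for a in axes[:3]]
    ang = [a * angular_scale for a in axes[3:6]]
    return {"linear": lin, "angular": ang}

cmd = axes_to_twist([0.5, 0.0, -1.0, 0.0, 0.0, 0.2])
print(cmd["linear"])   # scaled translational velocity command
print(cmd["angular"])  # scaled rotational velocity command
```

Publishing a command like this at a fixed rate (and zeroing it when the device is idle) gives the continuous, drag-the-TCP feel, without planning a trajectory first.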
Pretty cool, but too slow.
Very good