XELA Robotics is currently rolling out features for the upcoming UAi Software and UAi App. Both will host many features that improve the overall uSkin experience and benefit our users.
For more detailed information on each function, please scroll through the page.
Below is a list of all functions released with our sensor software. These functions are either already included or available as add-ons, enhancing the capabilities and performance of our sensors.
All measurements are visualised in real time, on both Windows and Linux.
Provides the coordinates of all contacts. Different contact areas are separated, and the centre of each area is reported.
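As an illustration, separating contact areas on a grid of taxels and finding each centre can be sketched as a connected-component search. This is a minimal sketch, assuming a 2-D grid of per-taxel force magnitudes; the grid layout and threshold are hypothetical, not XELA’s actual implementation:

```python
from collections import deque

def find_contacts(grid, threshold=0.1):
    """Separate contact areas on a taxel grid and return the centre of each.

    grid: 2-D list of per-taxel force magnitudes (hypothetical units).
    Returns a list of (row, col) centroids, one per connected contact area.
    """
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and not seen[r][c]:
                # Flood-fill one contact area (4-connectivity).
                queue, cells = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Centre of the contact area = mean taxel position.
                centroids.append((sum(y for y, _ in cells) / len(cells),
                                  sum(x for _, x in cells) / len(cells)))
    return centroids
```

Two taxels pressed in one region and two in another would yield two separate centroids, one per contact.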
Obtain tactile measurements. Our software collects the measurements from all skin patches and prepares them for your application. Currently, we provide the measurements in Windows and Linux, as well as for ROS.
The sensor measurements can drift slightly due to temperature changes. We remove this temperature drift from the measurements by using temperature reference sensors.
Our sensors can pick up interference from nearby magnetic fields. Using our patented technology, we remove this interference with reference measurements.
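Both corrections amount to subtracting a signal derived from a reference sensor. A minimal sketch, assuming linear temperature drift and additive magnetic interference with hypothetical coefficients (the patented method itself is not public):

```python
def remove_temperature_drift(raw, temp_c, temp_at_calibration_c, drift_per_deg_c):
    """Subtract linear temperature drift estimated from a reference
    temperature sensor (drift coefficient is a hypothetical per-sensor value)."""
    return raw - drift_per_deg_c * (temp_c - temp_at_calibration_c)

def remove_magnetic_interference(raw, reference_reading, coupling=1.0):
    """Subtract external-field interference as seen by a reference
    (non-contact) measurement (coupling factor is hypothetical)."""
    return raw - coupling * reference_reading
```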
This function is an optional add-on.
The raw measurements are converted into force measurements in Newton. XELA Robotics offers two options for calibration. Both options give the user more control over the grasped object, enabling you, for example, to grasp objects with a predetermined force.
Grasp objects with predefined force. Set the desired grasping force, and our software ensures that the object is grasped with this force.
Our tactile sensors can measure 3-axis force, not only pressure, and can be customised for your specific application.
This gives robots a human-like sense of touch, allowing them to grasp and manipulate objects precisely.
Our postprocessing functions, such as force calibration, improve the overall quality of the tactile data collection.
Additionally, magnetic interference and temperature drift can be removed by using our patented technology.
Active Function (for uSPa 44)
XELA Robotics offers two options for calibration. The raw measurements are converted into force measurements in Newton, giving the user more control over the grasped object and enabling you, for example, to grasp objects with a predetermined force.

Option 1 (free): All uSkin sensors are calibrated with XELA’s universal parameters according to our patented technology. This option is free of charge.

Option 2 (add-on): Each sensing point is calibrated individually, and slight differences between the sensing points are equalised to guarantee a more uniform response. This type of calibration improves the sensor’s accuracy, resulting in more detailed data collection.
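The two options can be pictured as a linear raw-to-Newton mapping applied either with one universal parameter set or with per-sensing-point parameters. This is a minimal sketch; the linear model and all parameter values are hypothetical, not XELA’s actual calibration:

```python
def to_newton(raw, scale, offset):
    """Convert a raw reading to force in Newton with a linear model (illustrative)."""
    return scale * raw + offset

# Option 1: one universal parameter set applied to every sensing point.
UNIVERSAL = {"scale": 0.01, "offset": 0.0}  # hypothetical values

# Option 2: per-taxel parameters equalise slight differences between points.
PER_TAXEL = {(0, 0): {"scale": 0.0098, "offset": 0.02},
             (0, 1): {"scale": 0.0103, "offset": -0.01}}  # hypothetical values

def calibrate(readings, per_taxel=None):
    """readings: {taxel: raw}. Uses per-taxel parameters when available,
    falling back to the universal set otherwise."""
    out = {}
    for taxel, raw in readings.items():
        p = (per_taxel or {}).get(taxel, UNIVERSAL)
        out[taxel] = to_newton(raw, p["scale"], p["offset"])
    return out
```

With per-taxel parameters, two taxels that respond slightly differently to the same load report the same force after calibration.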
Only the sense of touch can tell you whether you are grasping an object with the right amount of force, whether it is slipping out of your hand, and so on.
Our grasping functions are designed for robotic integration to improve the overall interaction with a particular object.
Detect the onset of slip. We can detect slip within a few milliseconds, and the gripper can respond, for example, by grasping the object slightly more firmly.
Predict grasp success. Even before lifting the object, we can distinguish stable from unstable grasps. This provides the opportunity to re-grasp the object in advance and to ensure that it does not slip out of the hand during transport.
After detecting an unstable grasp, we provide suggestions for how to change the grasp to make it more stable. In particular, the object can be grasped more firmly, if permissible, or the position of the grasp can be modified.
If the grasp is too firm, the object deforms. After detecting this, the grip strength can be reduced if desired.
By collecting data about grasps, the algorithm automatically improves due to the additional training data.
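The behaviours above can be sketched as one cycle of a force control loop: regulate toward the predefined grasping force, and grasp slightly more firmly the moment slip is detected. This is a minimal sketch with hypothetical gains and interfaces, not XELA’s actual controller:

```python
def update_grip(command, measured, target, slip, gain=0.1, slip_boost=0.2):
    """One cycle of a force-controlled grasp (illustrative, hypothetical gains).

    command:  current grip-force command in Newton
    measured: grip force reported by the tactile sensor in Newton
    target:   predefined grasping force in Newton
    slip:     True when slip onset was detected this cycle
    """
    if slip:
        # Respond to slip immediately by grasping slightly more firmly.
        command += slip_boost
    # Nudge the command so the measured force converges on the target.
    command += gain * (target - measured)
    return command
```

Called once per sensor cycle, this converges on the target force while reacting to slip within a single cycle.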
Our tactile property recognition functions give the user a much better understanding of a particular object by revealing its internal and external properties through a specific interaction.
Recognise objects from a database of previously memorised objects. By lightly touching an object, its tactile features can be extracted and matched against the database.
This can be used, for example, to ensure that the grasped object is the correct one. It complements vision, for example when vision fails due to occlusions or poor lighting conditions.
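Matching against a database of memorised objects can be sketched as a nearest-neighbour lookup over tactile feature vectors. The feature vectors and object names here are purely illustrative:

```python
def match_object(features, database):
    """Return the memorised object whose tactile feature vector is closest
    to the measured one (squared Euclidean distance; features are illustrative)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(database, key=lambda name: sq_dist(features, database[name]))
```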
Localise objects in the hand. This function reveals the object’s position relative to the gripper.
Detect the orientation of the object within the gripper. The correct orientation of the object is crucial for various tasks, for example, when inserting the object into a hole.
In addition to visual information, the local object shape can be obtained through tactile sensing. By repeatedly touching the object, the overall object shape can be obtained, and this information can be used to handle the object correctly.
Provides various information about the contacts. For each contact area, we provide the curvature radius of the object and can recognise contact-area properties such as edges, corners, and flat surfaces.
Detects the adhesion of an object by lightly rubbing it.
This provides valuable information for deciding how to grasp the object reliably.
Detects the weight of an object by slightly lifting it.
This can be used for quality control and/or for deciding the necessary grasping force.
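Weight estimation during a slight lift reduces to dividing the total measured load by gravitational acceleration. A minimal sketch, assuming the object is fully supported and static while being weighed:

```python
G = 9.81  # gravitational acceleration, m/s^2

def estimate_mass_kg(normal_forces_n):
    """Estimate object mass from per-taxel normal forces (in Newton)
    measured while the object hangs in the grasp (illustrative)."""
    return sum(normal_forces_n) / G
```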
Detects the roughness of an object. By lightly moving across the surface of an object, its texture can be detected.
This can be used, for example, for quality control and for appropriately grasping the object.
Detects the stiffness of an object by slightly squeezing it. This can be used, for example, to avoid grasping objects too hard and breaking them.
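Stiffness from a slight squeeze can be sketched as the slope of measured force against squeeze depth, fitted by least squares. The sample values are hypothetical:

```python
def estimate_stiffness(displacements_mm, forces_n):
    """Estimate stiffness (N/mm) as the least-squares slope of force
    versus squeeze depth (illustrative; assumes a linear response)."""
    n = len(forces_n)
    mx = sum(displacements_mm) / n
    my = sum(forces_n) / n
    num = sum((x - mx) * (y - my) for x, y in zip(displacements_mm, forces_n))
    den = sum((x - mx) ** 2 for x in displacements_mm)
    return num / den
```

A soft object yields a small slope, a rigid one a large slope, which can then be thresholded to cap the grip force.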