Collection of the Latest AI-Translated Q&A Posts (2023/10/21–2023/10/27)

This post compiles the latest questions and answers from the Q&A category, translated by AI. The questions were posted between October 21 and October 27, 2023.
Please note: AI translation has its limitations, so specific content should still be interpreted in light of its actual context.

Relationship between the Geometric Center Point (XYZ) of Automatically Generated Objects, the Camera Reference Frame (XYZ), and the Robot Reference Frame (XYZ)

1. Q1 (2023/10/23)

What is the relationship between the geometric center point (XYZ) of automatically generated objects, the camera reference frame (XYZ), and the robot reference frame (XYZ)? Can you explain it?

1. A1 (2023/10/23)

Hello, in the model-making process, the automatically generated geometric center point simply represents the geometric center of the point cloud template. It does not inherently belong to any specific reference frame.

If you use the Step “3D Matching” to recognize a workpiece, the obtained center point will be in the camera reference frame. If this is the picking pose for your robot, you will need to apply a pose transformation to convert the result into the robot reference frame before sending it to the robot.
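For intuition, that transformation is a single rigid-body transform. Below is a minimal sketch (not Mech-Vision code), assuming the camera extrinsics are available as a 4×4 homogeneous matrix from hand-eye calibration; all names and values are hypothetical:

```python
import numpy as np

# Hypothetical extrinsics from calibration: the camera pose expressed
# in the robot base frame, as a 4x4 homogeneous transformation matrix.
T_base_camera = np.array([
    [1.0,  0.0,  0.0, 0.50],
    [0.0, -1.0,  0.0, 0.00],
    [0.0,  0.0, -1.0, 1.20],
    [0.0,  0.0,  0.0, 1.00],
])

# Geometric center returned by matching, in the camera frame (meters),
# written in homogeneous coordinates.
p_camera = np.array([0.01, -0.02, 0.85, 1.0])

# The same point expressed in the robot base frame.
p_base = T_base_camera @ p_camera
print(p_base[:3])
```

A full picking pose also carries an orientation, which is transformed by composing the rotation parts in the same way.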

Project Registration

2. Q1 (2023/10/23)

[screenshot]

In the solution, a new project has been added. Why doesn’t this new project have the up and down arrows on the left like the other projects?

2. A1 (2023/10/23)

After a new project is created, it remains unfinished until its output ports are connected in the editing area, so it displays a “to be edited” status. Once the project is fully built and successfully registered, the project name will show a normal project number with the up and down arrows on the left, and the name will be displayed in green, as shown in the screenshot below.

2. Q2 (2023/10/23)

How do I register a project?

2. A2 (2023/10/23)

The term “register a project” in this context refers to setting the project to load automatically with the solution; it is unrelated to the presence of arrows.

To set a project to load automatically with the current solution, right-click the solution name and check the “Autoload Project” option.

Flange Pose

3. Q (2023/10/21)

What does “setting the flange pose” in Mech-Center’s log mean? Is it something that can be configured? Do we configure it in our software, or is it configured on the robot side?

3. A1 (2023/10/23)

The flange pose relates to scenarios where the camera is mounted on the robot’s end (eye in hand). It is information that must be provided to the camera when capturing images. Currently, this functionality is integrated into the standard interface and does not require any modification.

3. A2 (2023/10/23)

Hello, the flange pose typically refers to the pose of the center of the robot’s sixth-axis flange in the robot base reference frame.

Usually, in eye-in-hand (EIH) mode, the robot needs to provide its current flange pose to the vision system when capturing images. This information is transmitted from the robot to the vision system. You can refer to the TCP/IP commands and the respective robot programs for more details (a conceptual sketch follows the links):

TCP/IP Interface Commands
Standard Interface Communication
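To illustrate why the flange pose is needed in EIH mode, here is a minimal numpy sketch (not the actual interface code; the helper and all values are hypothetical, and rotations are omitted for brevity). With the camera on the flange, the camera pose in the base frame at capture time is the current flange pose composed with the fixed hand-eye calibration result:

```python
import numpy as np

def translation_matrix(x, y, z):
    """Hypothetical helper: a translation-only pose as a 4x4 matrix."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Current flange pose reported by the robot at the moment of capture.
T_base_flange = translation_matrix(0.4, 0.0, 0.8)

# Fixed camera-on-flange transform from hand-eye calibration (EIH).
T_flange_camera = translation_matrix(0.0, 0.05, 0.10)

# Camera pose in the robot base frame at capture time; this is why the
# robot must send its flange pose along with each capture request.
T_base_camera = T_base_flange @ T_flange_camera
print(T_base_camera[:3, 3])
```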

SORT_BY_ROW_COL

4. Q (2023/10/23)

Is it possible to extract the “SORT_BY_ROW_COL” functionality from “Advanced Components” – “Adjust Poses” for individual use?

Or can this effect be achieved through a combination of multiple function blocks?

4. A (2023/10/23)

  1. The standard Procedure “Sort by Two Values” can effectively achieve sorting by rows and columns (see the sketch after this list).
  2. Parameter settings: for row sorting, the “Layer Interval” parameter is equivalent to the “Dist Between Rows” parameter in the Step “Adjust Poses”; for column sorting, the “Layer Interval” parameter generally does not require adjustment.
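As a rough illustration of what row/column sorting amounts to (outside the software, with hypothetical data): quantize one coordinate into row bins using the row spacing, then sort by row first and column second:

```python
# Minimal sketch: poses as (x, y, z) tuples, rows spaced along y.
poses = [
    (0.30, 0.11, 0.0),
    (0.10, 0.09, 0.0),
    (0.20, 0.31, 0.0),
    (0.10, 0.29, 0.0),
]

ROW_SPACING = 0.2  # plays the role of the "Dist Between Rows" parameter

def row_col_key(pose):
    x, y, _ = pose
    row_index = int(y // ROW_SPACING)  # quantize y into row bins
    return (row_index, x)              # row first, then column

poses_sorted = sorted(poses, key=row_col_key)
print(poses_sorted)
```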

DI Signals: How to Obtain the Robot’s IO Signal Status?

5. Q (2023/10/24)

[screenshot]
What communication command does the robot use to obtain its IO status?

Can this command only check the robot’s input signals? Is it possible to detect the robot’s DO status as well?

5. A (2023/10/24)

To check a DI (digital input) signal: this command detects the signal value of a specified DI port. Set the DI port number according to the actual robot wiring.
Check DI

  • If you are using the master-control communication method, you can check and set the robot’s IO directly in the workflow after programming and running the vision program.
  • If you are using the standard interface, it is recommended to use branching to trigger different paths.

Which approach to use depends on your specific needs and what you want to achieve by checking DI. The instructions for this Step are provided in the link above; I hope this helps.
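For intuition, branching on a DI value reduces to something like the sketch below (the function and port number are hypothetical stand-ins, not the actual interface API):

```python
def check_di(port: int) -> int:
    """Hypothetical stand-in for the Check DI query; returns 0 or 1."""
    return 1  # e.g., a sensor reporting that a part is present

# Branch the workflow on the DI signal, e.g., one path per station.
if check_di(port=3) == 1:
    path = "station_A"  # signal high: run the path for station A
else:
    path = "station_B"  # signal low: run the path for station B
print(path)
```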

Can a single camera trigger and send results to two robots, both operating simultaneously?

6. Q1 (2023/10/24)

Due to cycle time constraints, we need to add a second robot. Can a single camera trigger and send results to two robots operating simultaneously?

The robots in use are UR robots. Are there any cases or examples of this being done?

6. A1 (2023/10/24)

  1. Hello, currently the standard interface does not support this type of communication.
  2. We recommend establishing direct communication between the vision system and a PLC (programmable logic controller); the PLC can then send the vision results to each of the two robots separately (a conceptual sketch follows).
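Conceptually, the middleman simply duplicates the vision result to both robots. A minimal socket sketch with hypothetical addresses and a placeholder message format (a real deployment would use the PLC’s own protocol):

```python
import socket

# Hypothetical robot endpoints and a placeholder pose string.
ROBOT_ENDPOINTS = [("192.168.5.101", 50000), ("192.168.5.102", 50000)]
vision_result = "X,Y,Z,A,B,C"

# Send the same result to each robot over its own connection.
for host, port in ROBOT_ENDPOINTS:
    with socket.create_connection((host, port), timeout=2.0) as s:
        s.sendall(vision_result.encode("ascii"))
```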

6. Q2 (2023/10/24)

Does this mean that it won’t support collision detection between the two robots simultaneously?

6. A2 (2023/10/24)

Yes, that is correct.

Mech-Center Log Always Shows Camera Disconnection and Reconnection

7. Q (2023/10/25)

  • Software Version: Mech-Vision 1.7.2.

  • Camera Model and Firmware Version:

    • Model: Mech-Eye NANO.
    • IP: 192.168.5.20:5577 (static)
    • Firmware: 2.1.0 31f1fcf28
  • Problem Description
    During production, Mech-Center constantly displays camera disconnection and then reconnection.

  • Attempted solutions and checks:

    1. Tightened the network cable on the camera side.
    2. Restarted the camera’s power.
    3. Restarted the router.

7. A (2023/10/25)

Hello,

Regarding the camera disconnection issue, check whether the disconnection times coincide with entries in the Mech-Eye logs. This helps determine whether the problem is network-related or power-related.

You can refer to the following link for more information:
Main causes and solutions for camera disconnections

For specific details about camera wiring, please refer to:
Guidelines for mounting camera cables

First, focus on the above troubleshooting steps for the current problem, and continue to monitor the camera’s behavior. If the issue persists, you can contact the corresponding support personnel for a more in-depth analysis and investigation.

Using API Command 102 to Return Data

8. Q (2023/10/25)

  • Software Version: Mech-Vision 1.7.4, Mech-Viz 1.7.4, Mech-Eye Viewer 2.1.0
  • Robotic Arm: ABB IRB 1200 707

When using the 102 command to receive data, does the TCP of the vision target point include three-dimensional coordinates (XYZ, in millimeters) and Euler angles (ABC, in degrees)? Is the Euler angle order always Rx Ry Rz, or does it vary depending on the robotic arm?

8. A (2023/10/25)

The Euler angles returned will vary according to the specific robotic arm used.
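To see why the convention matters, the same rotation yields different angle triplets under different Euler orders. A short SciPy sketch (the conventions shown are for illustration only, not what any particular robot uses):

```python
from scipy.spatial.transform import Rotation as R

# One rotation, defined with extrinsic Rx Ry Rz angles in degrees.
rot = R.from_euler("xyz", [10.0, 20.0, 30.0], degrees=True)

print(rot.as_euler("xyz", degrees=True))  # [10. 20. 30.]
print(rot.as_euler("zyx", degrees=True))  # different triplet, same rotation
```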

Camera Information

9. Q1 (2023/10/26)

Do you have detailed technical specifications for the PRO XS camera?

9. A1 (2023/10/26)

Hello, you can refer to the camera information in the download center: Camera materials.

9. Q2 (2023/10/27)


There is no hardware user manual for the PRO XS; I’ve checked.

9. A2 (2023/10/27)

We’re sorry! The hardware user manual for the PRO XS is still in production and should be available online in approximately two weeks.

How to Import Images Obtained from Mech-Eye Viewer into Mech-Vision for Processing

10. Q (2023/10/27)

I have a question:

I want to import images obtained from Mech-Eye Viewer into Mech-Vision for processing, but I encountered an error at the Step “3D Coarse Matching V2” indicating that the template point cloud contains non-compliant normals. What does this error mean?

How can I properly convert images obtained from Mech-Eye Viewer for normal use in Mech-Vision?


10. A (2023/10/27)

Images obtained directly from Mech-Eye Viewer do not contain normal information. Here are two solutions:

  1. Add normal information to the point cloud. Software such as MeshLab can generate normals when exporting.
  2. In the folder where you saved the images, you have both color and depth images. Use Mech-Vision to read the depth map and run these Steps: “From Depth Map to Point Cloud”, “Calc Normals of Point Cloud and Filter It”, and “Save Results to File”. This generates a point cloud with normal information, which you can then use in the “Matching Model and Pick Point Editor” to create a point cloud model.

Based on a PLY file, there are two methods for calculating normals (a script-based sketch follows the list):

  1. Calculating normals using MeshLab:

    • Open the point cloud file in MeshLab.
    • In the menu bar, select “Filters” > “Point Set” > “Compute Normals”.
    • In the pop-up window, click “Apply”.
    • To save, go to “File” > “Save Project As” in the menu bar and save it as a point cloud file.
  2. Calculating normals using Mech-Vision:

    • Open Mech-Vision and use the Step “Read Point Cloud V2”.
    • Add the Step “Smooth Point Cloud and Estimate Normals” and set its Search Radius to 0.5 mm.
    • Save the results to a file.
    • If you can’t find “Smooth Point Cloud and Estimate Normals” in the Step library, expand all Steps and search for it.
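Outside of MeshLab and Mech-Vision, the same result can be scripted. A sketch using the Open3D library (the file names are hypothetical, and the 0.5 radius mirrors the 0.5 mm suggestion above, assuming the cloud is in millimeters; adjust to your units):

```python
import open3d as o3d

# Load the point cloud exported from Mech-Eye Viewer.
pcd = o3d.io.read_point_cloud("template.ply")

# Estimate normals with a radius search (0.5 assumes millimeter units).
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamRadius(radius=0.5)
)

# Orient normals consistently, e.g., toward the camera at the origin.
pcd.orient_normals_towards_camera_location([0.0, 0.0, 0.0])

# Save a point cloud that now carries normal information.
o3d.io.write_point_cloud("template_with_normals.ply", pcd)
```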

Sorting Workpieces from Outer Ring to Inner Ring

11. Q (2023/10/26)

How can workpieces in a pallet be sorted and picked up in the order of first picking the outer ring and then the inner ring?

11. A (2023/10/27)

Hello, you can achieve this with the Step “Adjust Poses”, sorting the poses by their distance from a predetermined center point and picking the farthest (outer ring) first.
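As a sketch of the underlying idea (hypothetical data, outside the software): compute each pose’s distance from the pallet center and sort in descending order so that outer-ring workpieces come first:

```python
import math

# Hypothetical picking positions (x, y) and pallet center.
poses = [(0.10, 0.10), (0.40, 0.00), (0.00, 0.45), (0.05, -0.02)]
CENTER = (0.0, 0.0)

def dist_from_center(pose):
    return math.hypot(pose[0] - CENTER[0], pose[1] - CENTER[1])

# Descending distance: outer ring first, inner ring last.
pick_order = sorted(poses, key=dist_from_center, reverse=True)
print(pick_order)
```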