(286d) Images, Motion Capture, Three-Dimensional Modeling, and Haptics for Future Manufacturing | AIChE


Authors 

Durand, H. - Presenter, Wayne State University
Gjonaj, G., Wayne State University
O'Neill, R., Wayne State University
Rahman, M., Wayne State University
Headley, J., Wayne State University
Beacham, P., Wayne State University
Cherney, S., Wayne State University
Williamson, M., Wayne State University
Advances in computer graphics are inspiring many ideas for the future of manufacturing. For example, virtual reality has been discussed in the context of training [1]. As advances in automation and efficiency are contemplated for the process industries, it becomes important to ask how one might test and evaluate whether these technologies will benefit a given process, or how they might be designed to integrate effectively with other elements of a plant such as human operators, robotic systems, and artificially intelligent decision-making systems.

This talk presents case studies on a variety of topics relevant to next-generation manufacturing that fall broadly within the scope of computer graphics. Our goal is to investigate specifically how simulations of future manufacturing systems might be carried out. To reduce the complexity of these initial investigations, we perform the studies in Blender, an open-source computer graphics toolset whose image generation, Python scripting, and animation capabilities facilitate the creation of scenes involving human-plant interaction, processes in which images play an integral role in control, and situations where the three-dimensional plant context is important for taking actions [2]. The first study focuses on how Blender might be used to create an image-based control simulation for a zinc flotation process, which has previously been analyzed in an image-based control context (e.g., [3]). We first discuss how Blender can simulate the visual movement of bubbles in the flotation process: equations for bubble velocity and position are modeled through the Python programming interface and used to manipulate spheres in the 3D viewport, with keyframes setting the animation. We then demonstrate what a render of the froth looks like, showcasing how Blender can generate the images to be used in the control design. We describe how the Python programming interface can pass these images to tools such as OpenCV [4], enabling operations on the images that could aid in image-based control. Finally, we discuss how the model must be further refined by synthesizing information and inspiration from works that model the physics of a flotation process without graphics [5] and those that model bubble graphics without flotation physics [6].
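As a minimal sketch of the workflow described above (not the authors' actual code), a bubble's position can be integrated over animation frames from a simple velocity model and then keyframed onto a sphere through Blender's `bpy` interface. The constant-rise-velocity model, frame rate, and all parameter values below are illustrative assumptions, not taken from the cited flotation models.

```python
# Minimal sketch: integrate a bubble's position over animation frames.
# The rise-velocity model and all parameters here are illustrative.

FPS = 24               # Blender's default frame rate
DT = 1.0 / FPS         # seconds per animation frame
RISE_VELOCITY = 0.05   # m/s, assumed constant terminal rise velocity

def bubble_trajectory(start, n_frames, vz=RISE_VELOCITY):
    """Return a list of (x, y, z) positions, one per animation frame."""
    x, y, z = start
    positions = []
    for _ in range(n_frames):
        positions.append((x, y, z))
        z += vz * DT   # forward-Euler position update
    return positions

# Inside Blender, each position would be written to a sphere object and
# keyframed (bpy is only available inside Blender itself), e.g.:
#   import bpy
#   bpy.ops.mesh.primitive_uv_sphere_add(radius=0.01, location=positions[0])
#   sphere = bpy.context.active_object
#   for frame, loc in enumerate(positions, start=1):
#       sphere.location = loc
#       sphere.keyframe_insert(data_path="location", frame=frame)

traj = bubble_trajectory((0.0, 0.0, 0.0), n_frames=48)
print(traj[-1])  # bubble position at the final frame
```

In an actual froth simulation, many such spheres would be generated with varying radii and lateral drift, and the keyframed scene would then be rendered to produce the images used in the control design.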

We then discuss several other case studies that assess the use of Blender. For example, we showcase how chemical engineers might use plant data and reference images to build the basic layout of a plant. We also discuss how Blender can serve as a test framework for image-based safety monitoring algorithms: when no plant data is available (as would typically be the case in an academic setting), Blender can generate many potential actions of a human, and this synthetic data can then be treated as if it were captured from a plant to assess algorithms such as motion capture integrated with safety assessment. Finally, we offer some perspective on how haptic considerations might be evaluated with the aid of Blender.
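One elementary step in such an image-based monitoring pipeline is segmenting a rendered frame and counting distinct bright regions (e.g., froth bubbles or flagged zones). A real pipeline would apply OpenCV [4] to rendered Blender frames; the pure-Python sketch below, with an entirely synthetic "frame," only illustrates the idea of thresholding followed by connected-component counting.

```python
# Sketch of one image-analysis step: threshold a grayscale frame and
# count 4-connected regions of bright pixels. Illustrative only; a real
# pipeline would use OpenCV on rendered frames.

def count_bright_regions(image, threshold=128):
    """Count 4-connected regions of pixels brighter than `threshold`."""
    rows, cols = len(image), len(image[0])
    mask = [[image[r][c] > threshold for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    regions = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                regions += 1
                stack = [(r, c)]          # flood-fill one region
                while stack:
                    i, j = stack.pop()
                    if 0 <= i < rows and 0 <= j < cols and mask[i][j] and not seen[i][j]:
                        seen[i][j] = True
                        stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    return regions

# Synthetic 5x7 "frame": two separate bright blobs on a dark background.
frame = [
    [10,  10,  10, 10,  10,  10, 10],
    [10, 200, 200, 10,  10,  10, 10],
    [10, 200, 200, 10, 220, 220, 10],
    [10,  10,  10, 10, 220, 220, 10],
    [10,  10,  10, 10,  10,  10, 10],
]
print(count_bright_regions(frame))  # two distinct bright regions
```

Quantities extracted this way (region counts, areas, positions) are the kind of image-derived information that a monitoring or control algorithm could act on.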

[1] Fracaro, S. G., Chan, P., Gallagher, T., Tehreem, Y., Toyoda, R., Bernaerts, K., Glassey, J., Pfeiffer, T., Slof, B., Wachsmuth, S., & Wilk, M. (2021). Towards design guidelines for virtual reality training for the chemical industry. Education for Chemical Engineers, 36, 12-23.

[2] Oyama, H., Messina, D., O'Neill, R., Cherney, S., Rahman, M., Rangan, K. K., Gjonaj, G., & Durand, H. (in press). Test methods for image-based information in next-generation manufacturing. Proceedings of the IFAC Symposium on Dynamics and Control of Process Systems, including Biosystems (DYCOPS), Busan, Republic of Korea.

[3] Kaartinen, J., Hätönen, J., Hyötyniemi, H., & Miettunen, J. (2006). Machine-vision-based control of zinc flotation—a case study. Control Engineering Practice, 14(12), 1455-1466.

[4] Bradski, G. (2000). The OpenCV Library. Dr. Dobb’s Journal of Software Tools.

[5] Do, H. (2010). Development of a turbulent flotation model from first principles (Doctoral dissertation, Virginia Tech).

[6] Cleary, P. W., Pyo, S. H., Prakash, M., & Koo, B. K. (2007). Bubbling and frothing liquids. In ACM SIGGRAPH 2007 papers (Article 97).