Designing Engaging Interactive Animatronics: Strategies and Implementation Insights

Year: 2024 | Volume: 01 | Issue: 02 | Page: 27-33
By

Netra M. Deshpande

Krishna T. Madrewar

  1. Student, Department of Electronics & Telecommunication Engineering, Deogiri Institute of Engineering and Management Studies, Chhatrapati Sambhajinagar, Maharashtra, India
  2. Assistant Professor, Department of Electronics & Telecommunication Engineering, Deogiri Institute of Engineering and Management Studies, Chhatrapati Sambhajinagar, Maharashtra, India

Abstract

The field of animatronics uses robotic systems to imitate realistic movements and behaviours in mechanical figures. For an animatronic to be effective as a relationship mediator, animatronics in recreational settings should be perceived as human companions, which creates productive interaction scenarios. This study describes the design and implementation of a Java-based programming method, an Arduino microcontroller, an open software environment, a depth camera (Carmine by PrimeSense), and human recognition for an animatronic dragon. A person’s approximate skeletal data is extracted from the depth camera’s output. The programme monitors one or more people within the camera’s field of view based on their movements, and interactions among those observed are categorised on a per-person basis. The figure produces numerous realistic effects, such as sound, movement, and articulation across its human-like and dragon joints. This animatronic design will enable further investigation into the efficacy of interactive content in learning environments.
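The pipeline outlined above is typical of interactive animatronics: a host program extracts skeletal data from the depth camera and streams joint commands to the microcontroller, which drives the figure’s servos. The following Arduino sketch is a minimal illustration of the microcontroller side only, not the authors’ firmware; the pin numbers, baud rate, joint names, and the "neck,jaw" message format are all assumptions made for the sake of the example.

// Minimal illustrative sketch (hypothetical, not the authors' code):
// reads newline-terminated "neckAngle,jawAngle" pairs over serial
// and drives two servos. Pins and message format are assumptions.
#include <Servo.h>

Servo neckServo;  // hypothetical neck joint
Servo jawServo;   // hypothetical jaw joint

void setup() {
  Serial.begin(9600);   // must match the host tracking program's settings
  neckServo.attach(9);  // assumed PWM pins
  jawServo.attach(10);
}

void loop() {
  // Expect lines such as "90,45\n" from the host skeletal-tracking program,
  // giving the neck and jaw angles in degrees.
  if (Serial.available() > 0) {
    int neckAngle = Serial.parseInt();
    int jawAngle = Serial.parseInt();
    if (Serial.read() == '\n') {  // consume the line terminator
      neckServo.write(constrain(neckAngle, 0, 180));
      jawServo.write(constrain(jawAngle, 0, 180));
    }
  }
}

In the reported system, the host side would be the Java-based program performing skeletal tracking on the Carmine depth data; a multichannel servo controller, as listed in the keywords, would replace direct pin control when many joints must be driven simultaneously.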

Keywords: Animatronics, microcontroller, MATLAB, neural network, multichannel servo controllers, facial pose

[This article belongs to International Journal of Optical Innovations & Research (ijoir)]

How to cite this article: Netra M. Deshpande, Krishna T. Madrewar. Designing Engaging Interactive Animatronics: Strategies and Implementation Insights. International Journal of Optical Innovations & Research. 2024; 01(02):27-33.
How to cite this URL: Netra M. Deshpande, Krishna T. Madrewar. Designing Engaging Interactive Animatronics: Strategies and Implementation Insights. International Journal of Optical Innovations & Research. 2024; 01(02):27-33. Available from: https://journals.stmjournals.com/ijoir/article=2024/view=0


Regular Issue | Subscription | Review Article
Volume 01
Issue 02
Received January 17, 2024
Accepted March 20, 2024
Published April 29, 2024
