**Navigation:** [[System (Process)]] | [[Research Question]] | [[System (Keywords)]]
**Related:** [[Uncanny Valley]] | [[Machine Point Of View]] | [[Robots (Research)]] | [[Evolution of Adaptivity, Autonomy & Responsibility (Theory)]]
---
# Machine Embodiment
**Created:** [[2025]] — **Location:** Banská Bystrica
**Collaborators:** [MediaLab*](https://www.vsvu.sk/en/departments/department-of-visual-communication/studio-multimedia/), [Divadlo Štúdio Tanca](https://www.studiotanca.sk/)
**Medium:** [[Performance]] [[Event]] [[Visual]] [[Sound]]
---
## Project Description
Machine Embodiment explores the relationship between human and machine through performance, investigating how physical presence transforms our perception of artificial agents. The project examines the boundary between tool and actor through direct bodily interaction with robotic systems.
### Key Questions
- How does embodied interaction change our relationship with machines?
- What happens when human gestures directly control robotic movements?
- Can we experience machine "perspectives" through our own bodies?
### Performance Elements
- **Real-time control** of robotic limbs through human movement
- **Gestural mapping** from performer to machine (see the sketch after this list)
- **Feedback loops** between human intention and machine response
- **Audience observation** of human-machine coupling
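
As a minimal sketch of the real-time coupling described above, the snippet below maps one tracked body coordinate (the performer's wrist height) onto a single joint angle and clamps it to a safe range. All names, ranges, and units are illustrative assumptions, not the code used in the performance.

```python
# Minimal sketch of a gesture-to-actuator mapping (illustrative only).
# The capture and actuator interfaces used in the performance are not
# documented here; names, ranges, and units below are assumptions.

def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly remap a value from one range to another."""
    ratio = (value - in_min) / (in_max - in_min)
    return out_min + ratio * (out_max - out_min)

def clamp(value, lower, upper):
    """Keep actuator commands inside a mechanically safe range."""
    return max(lower, min(upper, value))

def gesture_to_joint_angle(wrist_height_m):
    """Map the performer's wrist height (metres) to a joint angle (degrees)."""
    angle = map_range(wrist_height_m, in_min=0.5, in_max=2.0,
                      out_min=0.0, out_max=120.0)
    return clamp(angle, 0.0, 120.0)

# Example: a wrist tracked at 1.25 m lands in the middle of the joint's range.
print(gesture_to_joint_angle(1.25))  # 60.0
```

A plain linear remap like this keeps the relationship between gesture and machine legible to an audience; in practice one such mapping would run per actuated joint on every capture frame.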
## Theoretical Framework
### Embodied Cognition
Drawing on theories according to which cognition emerges from body-environment interaction:
- Intelligence is not purely computational but situated in physical experience
- The body is not merely hardware for the mind but an active agent in perception
- Machine intelligence may require embodied experience to achieve genuine understanding
### Tool vs. Actor
The performance reveals a crucial transformation:
- **With direct control:** Machine perceived as extension/tool of human
- **Without human input:** Machine perceived as autonomous actor
- **Transition moment:** The shift in status is immediate and striking
## Observations
### Human-Machine Coupling
When the performer directly controls the robot's movements:
- **Reduced distance** between human and machine
- **Sense of extension** rather than separate entity
- **Shared agency** in the resulting actions
### Autonomous Operation
When the machine acts independently:
- **Immediate status change** to autonomous actor
- **Unpredictability** becomes a key factor
- **Anthropomorphic attribution** of intentions
### Audience Response
Spectators show different reactions to:
- **Coupled operation:** Focus on human performer
- **Autonomous operation:** Focus shifts to machine as independent entity
- **Switching between modes:** Highlights the constructed nature of agency
## Connection to System Project
This performance work directly informs the broader System project by:
1. **Testing embodiment theories** in practice
2. **Exploring human-machine boundaries** through direct experience
3. **Investigating perception of machine agency** in real-time
4. **Understanding the role of physical presence** in artificial systems
## Technical Implementation
### Hardware
- Custom robotic limbs and actuators
- Motion capture system for gesture tracking
- Real-time processing for movement mapping
- Wireless communication protocols
### Software
- Real-time gesture analysis
- Inverse kinematics for robotic control
- OSC communication between systems (both sketched below)
- Performance data logging
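
The sketch below shows, under stated assumptions, how two of these components might fit together: an analytic inverse-kinematics solve for a two-link planar limb, with the resulting joint angles published over OSC using the python-osc library. The link lengths, the OSC address `/limb/joints`, and the robot's network address are hypothetical placeholders, not documentation of the actual system.

```python
# Sketch: two-link planar inverse kinematics + OSC output (illustrative).
import math
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

L1, L2 = 0.30, 0.25  # assumed link lengths in metres

def two_link_ik(x, y):
    """Return (shoulder, elbow) angles in radians for a reachable target (x, y)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))   # guard against rounding error
    elbow = math.acos(cos_elbow)                 # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

client = SimpleUDPClient("192.168.0.50", 9000)   # hypothetical robot address/port

def send_target(x, y):
    """Solve IK for the target point and publish the joint angles via OSC."""
    shoulder, elbow = two_link_ik(x, y)
    client.send_message("/limb/joints", [shoulder, elbow])

send_target(0.35, 0.20)  # example target within the limb's reach
```

OSC over UDP keeps the gesture-to-robot latency low enough for live performance, which is why it is a common choice for linking capture, mapping, and actuation software in setups of this kind.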
---
## Media Documentation
### Video
<div style="padding:133.33% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/1066164767?badge=0&autopause=0&player_id=0&app_id=58479" frameborder="0" allow="autoplay; fullscreen; picture-in-picture; clipboard-write; encrypted-media" style="position:absolute;top:0;left:0;width:100%;height:100%;" title="System (Machine Embodiment)"></iframe></div><script src="https://player.vimeo.com/api/player.js"></script>
### Images
![[embodiment_dst2.jpg]]
*Performance setup showing human-robot interaction*
![[embodiment_dst3.jpg]]
*Audience observing the coupling between performer and machine*
![[walker_01.jpg]]
*Robotic limb responding to human gesture*
![[walker_03.jpg]]
*Detail of robotic mechanism*
![[walker_04.jpg]]
*Real-time gesture mapping in action*
![[walker_02.jpg]]
*Performer controlling multiple robotic elements*
![[walker_05.jpg]]
*Machine operating autonomously*
---
## Performance Log
*[This section could include dates, venues, audience feedback, technical notes, etc.]*
---
**See also:** [[System (Process)]] | [[Robots (Research)]] | [[Evolution of Adaptivity, Autonomy & Responsibility (Theory)]]