Feels like the kind of headline the camera would briefly pause on to establish how the world got "like this".
rybosome 11 hours ago [-]
I have far more ideas about this than time to execute them, but for a long time I’ve had this fantasy about a robot bandmate.
The idea is I’d go on stage singing and playing guitar with a looper and some samples, then bring out a robot toy and introduce it as the bandmate that’s “controlling” the looping and sampling.
It’s a gimmick that’s been done before, but with LLMs driving the verbal interaction, and now this to animate a robot… it becomes pretty compelling. I’d plug the LLM into the audio feed so I could banter with it and get responses, then have the robot avatar animate accordingly.
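A rough sketch of the loop I have in mind, assuming an OpenAI-style chat API for the replies; the robot and mic hooks are placeholders, not a real API:

  # Stage audio -> transcript -> LLM reply -> robot animation.
  # robot.* and the mic capture below are placeholders.
  from openai import OpenAI

  client = OpenAI()

  def banter(transcript: str) -> str:
      """Send the latest stage chatter to the model and return the bandmate's reply."""
      resp = client.chat.completions.create(
          model="gpt-4o-mini",
          messages=[
              {"role": "system", "content": "You are a deadpan robot bandmate who runs the looper."},
              {"role": "user", "content": transcript},
          ],
      )
      return resp.choices[0].message.content

  # transcript = speech_to_text(mic_feed)                      # e.g. Whisper on the live feed
  # robot.speak(banter(transcript)); robot.animate("talking")  # placeholder robot calls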
If only my full-time job saw value in this project.
yard2010 7 hours ago [-]
Imagine when you don’t need money anymore because everything is automated to oblivion. Everything is affordable. So people like you won’t have to work to make a living; you’ll just do your art instead. Better for everyone!
dathos 6 hours ago [-]
I cannot understand this optimism; in my industry the profits of automation only flow upwards.
brookst 50 minutes ago [-]
While the profits of tech have also flowed upward, even average and poor people have a much improved quality of life thanks to tech.
I’d prefer much less wealth inequality, but it’s not like the only benefit of automation is profit.
kevindamm 2 hours ago [-]
My understanding is that you need the optimists to mention it enough times before the world is ready, because repetition normalizes the concept in a way that never mentioning it (or mentioning it only cynically) couldn’t.
chrisdalke 13 hours ago [-]
Cool! As a fun moonshot idea, I’ve been interested in MCP as a way to use informal conversation to task robots. I’ll have to play around with this!
One example on unmanned boats: a human could radio the boat over VHF and say “move 100 meters south”… the speech-to-text would feed an LLM, which extracts the meaning and calls the appropriate MCP tool.
https://www.marksetbot.com/ (not affiliated, just a fan)
I’ll have to install this and play around.
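The geometry for that hop is simple; a back-of-the-envelope sketch of what the “move 100 meters south” step could look like once the LLM has extracted a direction and distance (coordinates and the surrounding MCP tool wiring are made up for illustration):

  # Turn "move 100 meters south" into a new waypoint, assuming the LLM has
  # already extracted {"direction": "south", "distance_m": 100} from the transcript.
  def offset_south(lat: float, lon: float, meters: float) -> tuple[float, float]:
      """Shift a position south by a distance in meters (small-offset approximation)."""
      meters_per_deg_lat = 111_320.0  # roughly constant anywhere on Earth
      return lat - meters / meters_per_deg_lat, lon  # pure southward move keeps longitude

  print(offset_south(45.0, -122.5, 100.0))  # ~ (44.999102, -122.5), illustrative coordinates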
sfeldma 12 hours ago [-]
Ya, sounds like a good idea to let the LLM do all the calculations and send simple instructions to the boat. MCP tells it what data is available from the device.
I tried the MCP server with the demo (https://merliot.io/demo) using Cursor and asked:
What is the location of the "GPS nano" device?
The location of the "GPS nano" device is:
Latitude: 30.448336
Longitude: -91.12896
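Under the hood, a device capability like that location query is just a tool the server advertises to the client; a minimal sketch with the Python MCP SDK, purely to show the shape (illustrative only, not the actual hub implementation):

  # Minimal sketch of exposing device data as an MCP tool; illustrative only.
  from mcp.server.fastmcp import FastMCP

  mcp = FastMCP("device-demo")

  @mcp.tool()
  def get_location(device: str) -> dict:
      """Return the last known GPS fix for a named device."""
      fixes = {"GPS nano": {"lat": 0.0, "lon": 0.0}}  # placeholder data, not live state
      return fixes.get(device, {})

  if __name__ == "__main__":
      mcp.run()  # stdio transport by default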
speerer 8 hours ago [-]
This seems like a residential address. Irrespective of whether the creator deliberately exposed it, I would be a little bit cautious about sharing it further.