Because it’s just silly AI-generated spam, don’t read too much into it.
ares623 3 days ago [-]
aren't these supposed to be bannable offences?
ThrowawayR2 3 days ago [-]
They are but somebody has to bring it to the attention of the HN moderators by emailing them for that to happen.
girvo 3 days ago [-]
They are, dang et al deal with it pretty strictly. Which I thank them for.
october8140 3 days ago [-]
This was posted earlier this week.
girvo 3 days ago [-]
I don’t get it. It says nothing leaves your computer, but it’s sending things to OpenRouter, not running models locally. Perhaps I am dumb (and tbf I always feel dumb after reading an AI-generated README for yet another AI tool)
3s 3 days ago [-]
Yes, it appears your personal data IS being sent to OpenRouter and the model provider here. The problem, I think, is that a lot of people (especially in the OpenClaw community) take “I run it on my Mac mini” to mean their data is private. Meanwhile all the data is being shipped off to Anthropic via OpenRouter for training, and both of those parties see everything.
I guess you could theoretically plug in a local model here, but of course the README should be more precise when talking about privacy.
hrmtst93837 3 days ago [-]
[dead]
_ache_ 3 days ago [-]
> Yes. OpenYak is local-first. Your conversations and files are stored only on your machine. When using cloud models, only API calls to LLM providers leave your computer.
So it’s local-first, yet it still uploads files to cloud models if you configure it that way.
Barbing 3 days ago [-]
>only API calls
Given the software’s broad appeal, I’d rephrase to make it clearer that every word/file you send would leave your computer.
_ache_ 3 days ago [-]
Absolutely, that is misleading to less technical people, maybe intentionally.
Does not inspire confidence.
gbalduzzi 3 days ago [-]
I read it as "everything controlled by us is local first and we do not collect any data about you"
I agree that someone may misunderstand their phrasing though
hrmtst93837 3 days ago [-]
You're reading it correctly: it's a thin OpenRouter wrapper calling itself local while your prompts still leave the machine.
jstummbillig 3 days ago [-]
> It says nothing leaves your computer
Where does it say that?
It sends to OpenRouter if you choose to use OpenRouter. It can use Ollama. Idk how to get more local than that? Any tool will be non-local when you do something explicitly non-local.
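For what it’s worth, the fully local path is easy to sketch against Ollama’s HTTP API. This assumes Ollama is running on its default localhost port (11434) and that a model such as llama3 has been pulled — both assumptions on my part, not anything from the README:

```python
import json
import urllib.request

# Ollama's default local endpoint; requests here never leave localhost.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model, prompt):
    """Build the POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(model, prompt):
    """Send a prompt to the local Ollama server and return its response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Calling `ask("llama3", "hello")` only ever talks to 127.0.0.1; point the same two functions at a hosted endpoint instead and the prompt leaves your machine — which is exactly the distinction being argued about here.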
raincole 3 days ago [-]
> run locally via Ollama
Are you saying this part is a lie?
girvo 3 days ago [-]
Yes.
In that I'm saying their AI generated slop README is misleading enough to be lying.
When you prominently say "all without uploading anything to the cloud." but the default does exactly that, a single half-sentence mentioning Ollama doesn't cut it.
Barbing 3 days ago [-]
This is a weird phenomenon with some dictation apps too.
They talk about privacy first, they talk up local-first, and then their default settings send every syllable to someone else’s computer. Once you understand the app it’s trivial to make it local, but there’s a good chance your first transcript came off a server, unlike what the marketing material suggested.
teleforce 3 days ago [-]
I've got a strong feeling that AI models and agents require a different operating system (OS) paradigm, one that's data-centric rather than file-system-centric, for more efficient, effective, and trustworthy operation. This new OS should work seamlessly and natively with data across different processors, for example CPUs, GPUs, TPUs, NPUs, accelerators, etc.
For a working example, please check TabulaROSA (Tabular Operating System Architecture), proposed by the MIT team. Instead of normal OS system calls, it uses data-based operations with D4M that work mathematically via associative arrays on structured or unstructured data [1],[2].
With the advent of CPU acceleration for fully homomorphic encryption, as demonstrated by Intel, an AI model or agent can even analyze data without ever decrypting it [3],[4].
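The “compute on encrypted data” idea is easy to demystify with a toy — emphatically not the lattice-based FHE in Intel’s chip, and not secure: textbook RSA happens to be multiplicatively homomorphic, so a party holding only ciphertexts can compute an encrypted product without ever seeing the plaintexts.

```python
# Toy homomorphic-encryption demo (insecure, illustration only):
# textbook RSA satisfies Enc(a) * Enc(b) mod n == Enc(a * b).

def rsa_keygen():
    p, q = 61, 53                  # tiny fixed primes, demo only
    n = p * q                      # public modulus
    phi = (p - 1) * (q - 1)
    e = 17                         # public exponent, coprime with phi
    d = pow(e, -1, phi)            # private exponent (modular inverse)
    return (e, n), (d, n)

def encrypt(m, pub):
    e, n = pub
    return pow(m, e, n)

def decrypt(c, priv):
    d, n = priv
    return pow(c, d, n)

pub, priv = rsa_keygen()
a, b = 7, 6
ca, cb = encrypt(a, pub), encrypt(b, pub)

# The "server" multiplies ciphertexts without decrypting anything...
c_prod = (ca * cb) % pub[1]

# ...and only the key holder can recover the result: 7 * 6 = 42.
print(decrypt(c_prod, priv))
```

Real FHE schemes support both addition and multiplication under encryption (hence “fully”), at a steep performance cost — which is the overhead the Intel hardware in [3],[4] is meant to attack.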
[1] TabulaROSA: Tabular Operating System Architecture for Massively Parallel Heterogeneous Compute Engines:
https://dspace.mit.edu/handle/1721.1/126114
[2] D4M: Dynamic Distributed Dimensional Data Model:
https://d4m.mit.edu/
[3] Intel Demos Chip to Compute with Encrypted Data (121 comments):
https://news.ycombinator.com/item?id=47322815
[4] Intel Demos Chip to Compute With Encrypted Data: Fully homomorphic encryption chip speeds operations 5,000-fold:
https://spectrum.ieee.org/fhe-intel
rbren 3 days ago [-]
Here are the prompts I use for my AI environment, though it's changed a bunch since the last snapshot:
https://github.com/rbren/personal-ai-devbox
What do you mean by interfaces in "These interfaces can do literally anything on the host machine. You're responsible for your own security"?
Also, your backdooring image links to a 404.
rbren 3 days ago [-]
The prompts contain e.g. a terminal UI, which gives you root access to the machine. If someone can access that UI and its backend, they can do whatever they want! So make sure to put it behind a firewall, basic auth, or something else.
Barbing 3 days ago [-]
Thanks for the link. You mention security; is the _average_ developer safer going with OpenClaw?
rbren 3 days ago [-]
It's probably just as hard to secure OpenClaw as this, but you'll find better tutorials for securing OpenClaw
nimchimpsky 3 days ago [-]
[dead]
zombot 3 days ago [-]
> owns your filesystem
Just when I thought it couldn't get worse than OpenClaw, someone proposes this, in all seriousness. I see a stellar future for them at OpenAI.
rakag 3 days ago [-]
What's the difference between this and OpenWork which has existed for a while?
jaimex2 3 days ago [-]
OpenWork supports Linux, whereas this does not.
SilverElfin 3 days ago [-]
What does “owns your filesystem” mean? That sounds dangerous.
h05sz487b 3 days ago [-]
It's your filesystem which is now, also, owned.
spiderfarmer 3 days ago [-]
I have used Cowork so much over the last couple of months and I have no reason to switch. But I’ll definitely give this a try.
kvakkefly 3 days ago [-]
Nice! The macOS download link is a 404.
imiric 3 days ago [-]
It looks like HN's new AI guideline is working as intended.
SomaticPirate 3 days ago [-]
Not to be too conspiratorial here but since the founder of OpenClaw was snatched up, there seems to be a rush of “open source” AI projects desperately bidding to be alternatives. Which can generate huge returns if one of the major players decides that “they also need a cowork-style product”
So it’s uniquely viable to be a sellout here and attempt to clone a major lab’s attempt on the off-chance you get acquired later.
the_real_cher 3 days ago [-]
So it's like open claw but you have to pay for it?
kennywinker 3 days ago [-]
It looks free / open source to me?
Factor1177 3 days ago [-]
Anyone else getting a 404 when trying to download?
systima 3 days ago [-]
How does this differ from Open Code Desktop?
jaimex2 3 days ago [-]
This doesn't support Linux where Open Code does.
Sathwickp 3 days ago [-]
A simpler version of openclaw?
october8140 3 days ago [-]
Flag this garbage.
wangzhangwu 3 days ago [-]
[dead]
firekey_browser 3 days ago [-]
[dead]
orangmisterius 3 days ago [-]
[dead]
wangzhangwu 3 days ago [-]
[dead]
https://news.ycombinator.com/item?id=47560380#47560381
When it's clear he is one of the major contributors to the project?
https://github.com/openyak/desktop/graphs/contributors