[-] ofcourse@lemmy.ml 19 points 3 weeks ago* (last edited 3 weeks ago)

The bill excludes creators of open-source models from liability for damages caused by forked models that have been significantly altered.

[-] ofcourse@lemmy.ml 35 points 3 weeks ago* (last edited 3 weeks ago)

The criticism of this bill from large AI companies sounds a lot like the pushback from auto manufacturers against adding safety features like seatbelts, airbags, and crumple zones. Just because someone else used a model for nefarious purposes doesn’t absolve the model creator of their responsibility to minimize that potential. We already expect this from a lot of other industries, like cars, guns, and tobacco: minimize the potential for harm even when it’s individuals’ actions, not the company directly, that cause the harm.

I have been following Andrew Ng for a long time and I admire his technical expertise. But his political philosophy around ML and AI has always centered on self-regulation, which we have seen fail in countless industries.

The bill specifically mentions that creators of open-source models will not be held liable for damages from versions of their models that others have altered and fine-tuned. It also only applies to models that cost more than $100M to train. So if you have that much money to train models, it’s very reasonable to expect that you spend some portion of it ensuring the models do not cause very large damages to society.

So companies hosting their own models, like OpenAI and Anthropic, should definitely be responsible for adding safety guardrails against the use of their models for nefarious purposes, at least those causing loss of life. The bill mentions that it would only apply to very large damages (exceeding $500M, for example), so one person finding a loophole isn’t going to trigger it. But if the companies fail to close these loopholes despite millions of people (or a few people, millions of times) exploiting them, then that’s definitely on the company.

As a developer of AI models and applications, I support the bill, and I’m glad to see lawmakers willing to get ahead of the technology instead of waiting for something bad to happen and then trying to catch up, as happened with social media.

[-] ofcourse@lemmy.ml 26 points 1 month ago

If I understand correctly from the article, you have to enter ‘OOBE\BYPASSNRO’ in the command prompt during installation to prevent it from asking you to connect to the internet. If that’s the only way to set up a local account, it’s hardly an accessible option.
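
For anyone attempting it, my understanding of the procedure, as an assumption based on the commonly reported steps rather than anything official from Microsoft, is roughly:

```
:: Unofficial sketch of the commonly reported bypass; details may change with Windows 11 updates.
:: On the "Let's connect you to a network" screen, press Shift+F10 to open a command prompt, then run:
OOBE\BYPASSNRO

:: Setup reboots and should then show an "I don't have internet" option that allows a local account.
:: Reportedly this just sets a registry flag, roughly equivalent to:
reg add HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\OOBE /v BypassNRO /t REG_DWORD /d 1 /f
```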

[-] ofcourse@lemmy.ml 51 points 1 month ago* (last edited 1 month ago)

Wouldn’t it be possible to buy a new PC, open the box, and return it right after because you cannot set it up without internet?

If enough people do it, maybe PC manufacturers will force Microsoft to bring back offline setup.

[-] ofcourse@lemmy.ml 54 points 1 month ago

New trade restrictions coming soon on Chinese pharmaceuticals.

[-] ofcourse@lemmy.ml 104 points 1 month ago

One of the funniest things about most of these companies enforcing RTO is that their “on-site interviews” are still virtual. So they believe being in person is more effective, except when it comes to paying travel expenses for interviewees.

Just shows the massive hypocrisy behind these RTO mandates.

[-] ofcourse@lemmy.ml 224 points 4 months ago

I reached out to Roku support regarding this. The rep told me, “Why are you complaining? You are the only one,” and then disconnected the chat. I’ve reached out to my state’s AG to report this. No action so far, but I’m waiting. If there are enough complaints, that might help move the needle.

What Roku is doing should be completely illegal: bricking a product you bought at full price if you don’t agree to waive your rights.

[-] ofcourse@lemmy.ml 43 points 5 months ago

FMLA. I started reading it as “fuck my life” before realizing it’s the Family and Medical Leave Act. So much hinges on that extra A.

[-] ofcourse@lemmy.ml 18 points 8 months ago

If the Democrats had voted to keep McCarthy as speaker despite his record of non-cooperation and reneging, it would have signaled to him and the Republicans that the Democrats can be pushed even further. It would have been a disastrous move politically for the Democrats. Remember, McCarthy voted against certifying the election results, so he did not have a great record of upholding democratic values and could not be trusted to negotiate in good faith.

[-] ofcourse@lemmy.ml 65 points 8 months ago

He probably took the plea deal because he can’t afford a lengthy legal battle, declining to pay his lawyers, getting court dates shifted, appointing his own judge, and so on. There is an entirely separate US justice system for the rich and powerful.

[-] ofcourse@lemmy.ml 14 points 8 months ago

The GitHub Copilot example seems to indicate it’s a pricing problem. In fact, this situation might indicate that users are finding it so useful that they are using it more than MS expected when it set up the monthly subscriptions. Over time, the models are going to be optimized and costs will come down.

Expecting AI to take over all human-intensive tasks is not realistic, but eventually it’s going to become part of a lot of repetitive tasks. I hope, though, that we see more open-source base models instead of the current situation, with 3-4 major companies providing the base models behind most AI applications.

[-] ofcourse@lemmy.ml 78 points 9 months ago

If CA decriminalizes it, everyone will be looking to the state to see whether it succeeds or fails. Opponents will try to find any excuse to shut it down, whether in CA or in other states. So if we can set up guidelines and the necessary infrastructure for safe use, both medical and recreational, it will be better for the long-term success of psilocybin legalization.

