AmbiguousProps

joined 1 year ago
[–] AmbiguousProps 2 points 12 hours ago

I can't put my finger on it, but I don't trust System76 anymore. It could be that their halt of Pop development while they worked on Cosmic left a bad taste in my mouth, but nonetheless, I don't think I'd buy hardware from them, especially at this price.

[–] AmbiguousProps 4 points 13 hours ago

The (wildcard) certs are the same, since they're what Caddy pulls via the Cloudflare API. You can either build the Cloudflare DNS module into Caddy with a docker build, or use a prebuilt image. It doesn't create two separate certs for local and remote.

It works really well for me, and is actually the most straightforward way I've found to get valid certs for internal services. Since they're wildcard certs, my internal subdomains don't get exposed through certificate transparency logs.
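For reference, the docker build route looks roughly like this. A minimal sketch, assuming the caddy-dns/cloudflare module and the official Caddy builder image (adjust the version tags to whatever you run):

```dockerfile
# Compile a Caddy binary with the Cloudflare DNS module baked in
FROM caddy:2-builder AS builder
RUN xcaddy build --with github.com/caddy-dns/cloudflare

# Drop the custom binary into the stock runtime image
FROM caddy:2
COPY --from=builder /usr/bin/caddy /usr/bin/caddy
```

The prebuilt option is the same idea, just with someone else having done the build stage for you.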

[–] AmbiguousProps 1 points 13 hours ago

It would probably take days to rebuild the array.

It's also important to note that RAID (and alternatives such as unRAID) is not a backup system and should not be relied on as one. If you have a severe brownout that fries more than two or three drives at once, for example, you will lose data if you're not backing up.

[–] AmbiguousProps 7 points 13 hours ago* (last edited 13 hours ago) (2 children)

Nah, as a fellow data hoarder you're 100% correct. I have a couple of dozen disks, and I've had failures from both Seagate and WD, but the Seagates have failed much more often. For the past couple of years, I've only purchased WD for this reason. I'm down to two Seagate drives now.

I feel like many people with a distaste for WD got burned by the consumer drives (especially the WD Greens). WD's DC line is so good though, especially the HC530.

[–] AmbiguousProps 1 points 13 hours ago

Efficiency still matters very much when self-hosting. You need to consider power draw (does your electrical service have enough amps for a single GPU? Probably. What about ten? Probably not) and heat (it's going to make you run more A/C in the summer, and do you have enough capacity to power an A/C on top of a massive pile of GPUs? Not likely).

Homes are not designed for huge amounts of hardware. I think a lot of self-hosters (including my past self) can forget that in their excitement over the hobby. Personally, I'm just fine not running huge models at home. I can get by with models that run on a single GPU, and even if I had more GPUs in my server, I don't think the results (which would still contain many hallucinations) would be worth the power cost, the strain on my A/C, and the possible electrical overload.
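To put rough numbers on the amperage point: here's a back-of-the-envelope sketch. The wattages, the 120 V circuit, and the 15 A breaker are all assumptions for illustration; check your own hardware and panel.

```python
# Back-of-the-envelope: does a GPU stack fit on a typical household circuit?
GPU_WATTS = 350       # assumed per-GPU draw under load
BASE_WATTS = 200      # assumed CPU, drives, fans, etc.
VOLTS = 120           # typical North American branch circuit
BREAKER_AMPS = 15     # common breaker; sustained load should stay under ~80% of it

for gpus in (1, 10):
    amps = (BASE_WATTS + gpus * GPU_WATTS) / VOLTS
    ok = amps <= 0.8 * BREAKER_AMPS
    print(f"{gpus:2d} GPU(s): ~{amps:.1f} A -> {'fits' if ok else 'exceeds'} a {BREAKER_AMPS} A circuit")
```

One GPU comes out around 5 A, which is fine; ten is north of 30 A, which no single household circuit will give you.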

[–] AmbiguousProps 5 points 14 hours ago (2 children)

You can use Caddy to get internal HTTPS via the Cloudflare DNS API, and no traffic needs to go through a Cloudflare tunnel for that.
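Roughly what that looks like in a Caddyfile. A minimal sketch, assuming the caddy-dns/cloudflare module is compiled in, your Cloudflare API token is in the CF_API_TOKEN environment variable, and the domain and backend address are placeholders:

```
# Wildcard site: the cert is obtained via the DNS-01 challenge against
# Cloudflare's API, so nothing has to be reachable from the internet.
*.home.example.com {
	tls {
		dns cloudflare {env.CF_API_TOKEN}
	}

	@jellyfin host jellyfin.home.example.com
	handle @jellyfin {
		reverse_proxy 192.168.1.20:8096
	}
}
```

Only your internal DNS (or hosts file) needs to resolve those names to LAN addresses, so the traffic never leaves your network.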

[–] AmbiguousProps 14 points 14 hours ago (1 children)

Under the proposed measure, a person registering for the first time must complete the voter registration form as well as have one of the required documents. For existing voters, county auditors must, by July 1, 2027, work with the state Department of Licensing to see which ones have an enhanced driver's license or identicard. Those who do will remain registered automatically.

Those who do not will get a notice to go to their county auditor’s office and present one of the approved documents to show they are a citizen. Reminder notices must be sent. A voter has up until 14 days before the November 2027 election to produce documents.

So those who don't have enhanced IDs will have their right to vote stripped and will be confused when they don't receive a ballot. By the time they get around to investigating, it'll be too late to vote.

It's a voter suppression initiative, shocker.

[–] AmbiguousProps 1 points 1 day ago

I don't think it's very relevant to the discussion of drug dealers using Graphene.

[–] AmbiguousProps 25 points 1 day ago

Yep, disabling it entirely still allows charging when the device is off, but otherwise the port is functionally useless and is disabled at the hardware level.

[–] AmbiguousProps 33 points 1 day ago

You can install Graphene from the browser; it's really not a huge hassle, especially if you do it right when you get the phone.

[–] AmbiguousProps 8 points 2 days ago (1 children)

Sir, this is a Wendy's.

[–] AmbiguousProps 1 points 5 days ago

I broke my tibia while bouldering, so be careful and warm up before you go.

 

With the recent first light milestone for the Vera Rubin Observatory, it's only a matter of time before one of astronomy's most long-awaited surveys begins. The Legacy Survey of Space and Time (LSST) is set to start on November 5 and will survey billions of stars across the sky for at least ten years.

One of the most important things it aims to find is evidence (or lack thereof) of primordial black holes (PBHs), one of the primary candidates for dark matter. A new paper posted to the arXiv preprint server by researchers at Durham University and the University of New Mexico looks at the difficulties the LSST will have in finding those enigmatic objects, especially the statistical challenges, and how they might be overcome.

 

Japan on Sunday successfully launched a climate change monitoring satellite on its mainstay H-2A rocket, which made its final flight before being replaced by a new flagship model designed to be more cost-competitive in the global space market.

The H-2A rocket lifted off from the Tanegashima Space Center in southwestern Japan, carrying the GOSAT-GW satellite as part of Tokyo's effort to mitigate climate change. The satellite was safely separated from the rocket and released into a planned orbit about 16 minutes later.

Scientists and space officials in the control room exchanged hugs and handshakes to celebrate the successful launch, which had been delayed by several days due to a malfunction in the rocket's electrical systems.

Keiji Suzuki, a Mitsubishi Heavy Industries official in charge of rocket launch operations, said he was more nervous than ever for the final mission of the rocket, which has been his career work. "I've spent my entire life at work not to drop H-2A rocket ... All I can say is I'm so relieved."

 

For years now, U.S. police departments have employed officers who are trained to be experts in detecting "drugged driving." The problem is, however, that the methods those officers use are not based on science, according to a new editorial in the Journal of Studies on Alcohol and Drugs (JSAD).

With marijuana now legal in many U.S. states, the need for reliable tests of marijuana impairment is more pressing than ever. Police can evaluate alcohol-intoxicated drivers using objective breath alcohol measurements, but there is no "breathalyzer" equivalent for marijuana. The drug is metabolized differently from alcohol, and a person's blood levels of THC (the main intoxicating chemical in marijuana) do not correlate with impairment.

So law enforcement relies on subjective tactics—roadside tests and additional evaluations by police officers specially trained as so-called drug recognition experts (DREs). These officers follow a standardized protocol that is said to detect drug impairment and even determine the specific drug type, including marijuana.

The process involves numerous steps, including tests of physical coordination; checking the driver's blood pressure and pulse; squeezing the driver's limbs to determine if the muscle tone is "normal" or not; and examining pupil size and eye movements.

But while the protocol has the trappings of a scientific approach, it is not actually based on evidence that it works, said perspective author William J. McNichol, J.D., an adjunct professor at Rutgers University Camden School of Law.

 

cross-posted from: https://lemmy.ca/post/46641802

 

cross-posted from: https://lemmy.ca/post/45858179

 

Used a 12-inch bit. It's a great workout, but it really sucked when we hit tree roots. Tomorrow, I'm going to set some posts in concrete using the holes.

 

The number and diversity of insects is declining worldwide. Some studies suggest that their biomass has almost halved since the 1970s. Among the main reasons for this are habitat loss—for example through agriculture or urbanization—and climate change.

These threats have long been known. What is less well understood is how these global change drivers interact and how their combined effects can become even more severe. For example, insects that have been deprived of their natural habitat could be even more affected by higher temperatures in a new environment.

Researchers at Julius-Maximilians-Universität Würzburg (JMU) have investigated precisely this serious interaction at 179 locations throughout Bavaria. The study is part of the LandKlif research cluster, coordinated by Professor Ingolf Steffan-Dewenter within the Bavarian Climate Research Network bayklif.

They published their results in the journal Proceedings of the Royal Society B: Biological Sciences.

 

In 2019, Tesla set out to lower insurance rates for owners of its electric cars. The goal was simple, at least in theory: fix the broken cost of car insurance. Instead, Tesla may have broken its own calculator trying to make sense of repair costs.

See, Musk's vision for Tesla's insurance product was that traditional companies just didn't "get it." Tesla's data claims that its Full Self-Driving software has fewer accidents than a human driver. Plus, its cars are rolling computers that can collect copious amounts of data on their drivers and adjust risk based on their driving. So why wouldn't drivers get a lower rate for puttering around with FSD enabled if they also happen to be safe drivers? Tesla quickly found out that, despite these assumptions, it's still taking a bath on claim-related losses.

The data comes from S&P Global and shows that the automaker's insurance subsidiary posted a loss ratio of 103.3 in 2024. The loss ratio, for those who don't know, is the amount of money Tesla pays out in claims versus the money it takes in from premiums. The lower the number, the better, and break-even is a flat 100. In 2024, the rest of the industry averaged 66.1.
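To make the arithmetic concrete (the ratios are from the article; the helper below is only illustrative):

```python
# Loss ratio = claims paid / premiums earned, expressed as a percentage.
# Above 100 means the insurer pays out more in claims than it collects.
def payout_per_premium_dollar(loss_ratio: float) -> float:
    return loss_ratio / 100

print(payout_per_premium_dollar(103.3))  # Tesla 2024: ~$1.03 paid out per $1.00 of premium
print(payout_per_premium_dollar(66.1))   # 2024 industry average: ~$0.66 per $1.00
```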

Archive link: https://archive.is/G4Kvj

 


The Federal Trade Commission has delayed the start of a rule that aims to make the process of canceling subscriptions less of a nightmare. Last year, the FTC voted to ratify amendments to a regulation known as the Negative Option Rule, adding a new "click-to-cancel" rule that requires companies to be upfront about the terms of subscription signups and prohibits them "from making it any more difficult for consumers to cancel than it was to sign up." Surprising no one, telecom companies were not happy, and sued the FTC. While the rule was nevertheless set to be implemented on May 14, the FTC now says enforcement has been pushed back 60 days to July 14.

Some parts of the updated Negative Option Rule went into effect on January 19, but enforcement of certain provisions was deferred to May 14 by the previous administration to give companies more time to comply. Under the new administration, the FTC says it has "conducted a fresh assessment of the burdens that forcing compliance by this date would impose" and decided it "insufficiently accounted for the complexity of compliance."

Once the July 14 deadline hits, the FTC says "regulated entities must be in compliance with the whole of the Rule because the Commission will begin enforcing it." But, the statement adds, "if that enforcement experience exposes problems with the Rule, the Commission is open to amending" it.

Archive link: https://archive.is/7XDVE
