this post was submitted on 07 Mar 2024
Technology
I'm highlighting that having the data is not enough if you don't have a good way to sort the trash out of it. Google will need to do that, not Reddit; Reddit is only handing the data over.
Is this clear now? If you're still struggling to understand it, refer to the context provided by the comment chain, including your own comments.
I'm saying reddit will not ship a trashed deliverable. Guaranteed.
Reddit will have already preprocessed for this type of data damage. Finding events in the data and understanding a timeseries of events is basic data engineering and trivial to do.
Google will be receiving data that is uncorrupted, because they'll get data properly versioned to before the damaging event.
If a high-edit event happens on March 7th, they'll ship March 7th - 1d. Guaranteed.
Edit to be clear: you're ignoring/not accepting the practice of noting a high volume of edits per user as an event, and using that timestamped event as a signal of data validity.
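The preprocessing described above could be sketched roughly like this. This is a toy illustration of the idea, not Reddit's actual pipeline; the log format, threshold, and function names are all made up for the example:

```python
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical threshold: edits per user per day considered "high volume".
EDIT_SPIKE_THRESHOLD = 50

def find_high_edit_events(edit_log):
    """edit_log: iterable of (user_id, edit_date) tuples.
    Returns the sorted dates on which any user exceeded the threshold."""
    counts = defaultdict(int)
    for user_id, edit_date in edit_log:
        counts[(user_id, edit_date)] += 1
    return sorted({d for (_, d), n in counts.items() if n > EDIT_SPIKE_THRESHOLD})

def snapshot_date(edit_log):
    """Version the deliverable to the day before the first damaging event."""
    events = find_high_edit_events(edit_log)
    if not events:
        return None  # no damage detected; ship the current data
    return events[0] - timedelta(days=1)

# Example: one user mass-edits on March 7th, so the shipped snapshot
# is versioned to March 6th (March 7th - 1d).
log = [("u1", date(2024, 3, 7))] * 60 + [("u2", date(2024, 3, 5))]
print(snapshot_date(log))  # -> 2024-03-06
```

In a real pipeline this per-user daily count would come out of a query or a timeseries aggregation, but the logic is the same: timestamp the anomaly, then cut the delivered version to just before it.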
Nobody said anything about the database being trashed. What I'm saying is that the database is expected to have data unfit for LLM training, that Google will need to sort out, and Reddit won't do it for Google.
Do you know it, or are you assuming it?
If you know it, source it.
If you're assuming, stop wasting my time with shit that you make up and your "huuuuh?" babble.
I know it because I've worked in corporate data engineering and large data migrations, and it would be abnormal to do anything else. There's a full review of test data, a scope of work, an acceptance period, etc.
You think reddit doesn't know about these utilities? You think Google doesn't?
You need to chill out and acknowledge how the industry works. I'm sure you're convinced, but your idea of things isn't how the industry works.
I don't need to explain to you that the sky is blue. And I shouldn't need to explain to you that Google isn't going to accept a damaged product, or that Reddit can do some basic querying and timeseries manipulation.
Edit: like you literally asked for a textbook.
In other words: "I dun have sauce, I'm assooming, but chruuuust me lol"
At this rate it's safe to simply ignore your comments as noise. I'm not wasting further time with you.
Seems like people are voting your comment as noise but whatever.
You are trying to prove something normal ISN'T happening. I'm describing normal industry behavior.
Seems like you need to prove an abnormal sitch is occurring.
Edit: it's like you're asking for proof that they'll build stairs with a handrail.