lvxferre@mander.xyz 8 months ago
I’m highlighting that having the data is not enough if you don’t also have a good way to sort the trash out of it. Google will need to do that, not Reddit; Reddit is only handing the data over.
Is this clear now? If you’re still struggling to understand, refer to the context provided by the comment chain, including your own comments.
GBU_28@lemm.ee 8 months ago
I’m saying reddit will not ship a trashed deliverable. Guaranteed.
Reddit will have already preprocessed for this type of data damage. This is basic data engineering: finding events in the data and understanding the time series around them is trivial.
Google will be receiving data that is uncorrupted, because they’ll get data properly versioned to before the damaging event.
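To sketch what I mean by versioning to before the event (this is just illustrative Python; the revision log and field names here are hypothetical, not Reddit’s actual schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Revision:
    comment_id: str
    edited_at: datetime
    body: str

def version_before_event(revisions: list[Revision], event_at: datetime) -> dict[str, Revision]:
    """For each comment, keep the latest revision made before the damaging event.
    Comments that only exist after the event have no pre-event revision and are dropped."""
    latest: dict[str, Revision] = {}
    for rev in revisions:
        if rev.edited_at >= event_at:
            continue  # skip revisions made during or after the mass-edit event
        kept = latest.get(rev.comment_id)
        if kept is None or rev.edited_at > kept.edited_at:
            latest[rev.comment_id] = rev
    return latest

# Example: restore everything to its state before a (hypothetical) event date.
# clean = version_before_event(all_revisions, datetime(2023, 6, 12))
```

That’s the whole trick: keep each comment’s last revision from before the mass-edit, drop anything that only exists after it.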
lvxferre@mander.xyz 8 months ago
Nobody said anything about the database being trashed. What I’m saying is that the database is expected to contain data unfit for LLM training, which Google will need to sort out, and Reddit won’t do that sorting for Google.
Do you know it, or are you assuming it?
If you know it, source it.
If you’re assuming, stop wasting my time with shit that you make up.
GBU_28@lemm.ee 8 months ago
I know it because I’ve worked on corporate data migrations, and it would be abnormal to do anything else. There’s a full review of test data, a scope of work, and an acceptance period.
You think reddit doesn’t know about these utilities? You think Google doesn’t?
You need to chill out and acknowledge how the industry works. I’m sure of it, and your idea of things isn’t how the industry works.
I don’t need to explain to you that the sky is blue. And I shouldn’t need to explain to you that Google isn’t going to accept a damaged product, or that Reddit can do some basic querying and time series manipulations.
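And to be concrete about “basic querying and time series manipulations”: even spotting the damaging event in the first place is just looking for a spike in edit activity. A rough sketch, again hypothetical and not anyone’s real tooling:

```python
from collections import Counter
from datetime import datetime, date

def find_mass_edit_days(edit_times: list[datetime], spike_factor: float = 5.0) -> list[date]:
    """Flag days where the edit count spikes well above the average daily rate.
    Those are candidate damaging events (e.g. mass comment overwrites)."""
    per_day = Counter(t.date() for t in edit_times)
    if not per_day:
        return []
    average = sum(per_day.values()) / len(per_day)
    return sorted(day for day, count in per_day.items() if count > spike_factor * average)
```

Feed the flagged dates into the versioning step above and you’ve done the preprocessing we’re arguing about.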