GamingChairModel
@GamingChairModel@lemmy.world
- Comment on What an unprocessed photo looks like 4 days ago:
Even the human eye basically follows the same principle. We have three types of cones, each sensitive to a different portion of the wavelength spectrum, and each cone cell produces a single-dimensional input: the intensity of light hitting that cell within its sensitivity range. Our visual cortex combines those inputs from both eyes, plus the information from the color-blind rods, into a seamless single image.
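A minimal sketch of the idea: each cone type reduces a full light spectrum to a single number by weighting the spectrum with its sensitivity curve. The Gaussian curves and peak wavelengths below are rough illustrative stand-ins, not real measured cone response data.

```python
import math

def gaussian(x, mu, sigma):
    # Simple bell-curve stand-in for a cone's sensitivity profile.
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def cone_responses(spectrum):
    """spectrum: list of (wavelength_nm, intensity) samples.
    Returns one scalar per cone type: the whole spectrum collapses
    to just three numbers, analogous to a camera's RGB channels."""
    peaks = {"S": 440, "M": 535, "L": 565}  # approximate peak sensitivities
    return {
        cone: sum(intensity * gaussian(wl, peak, 40) for wl, intensity in spectrum)
        for cone, peak in peaks.items()
    }

# A spectrum concentrated around 540-580 nm drives L and M far more than S.
greenish = [(wl, 1.0 if 540 <= wl <= 580 else 0.0) for wl in range(400, 701, 10)]
print(cone_responses(greenish))
```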
- Comment on What an unprocessed photo looks like 4 days ago:
This write-up is really, really good. I think about these concepts whenever people discuss astrophotography or other computation-heavy photography as being fake software-generated images, when the reality is that translating sensor data into a graphical representation for the human eye (and all the quirks of human vision, especially around brightness and color) requires conscious decisions about how the charges or voltages on a sensor should be translated into pixels in a digital file.
- Comment on The dominoes are falling: motherboard sales down 50% as PC enthusiasts are put off by stinking memory prices 1 week ago:
Do MSI and ASUS have enough corporate/enterprise sales to offset the loss of consumer demand? With the RAM companies the consumer crunch is caused by AI companies bidding up the price of raw memory silicon well beyond what makes financial sense to package and solder onto DIMMs (or even directly solder the packages onto boards for ultra thin laptops).
- Comment on China Has Reportedly Built Its First EUV Machine Prototype, Marking a Semiconductor Breakthrough the U.S. Has Feared All Along 2 weeks ago:
Cutting edge chip making is several different processes all stacked together. The nations that are roughly aligned with the western capitalist order have split up responsibilities across many, many different parts of this, among many different companies with global presence.
The fabrication itself needs to tie together several different processes controlled by different companies. TSMC in Taiwan is the current dominant fab company, but it’s not like there isn’t a wave of companies closely behind them (Intel in the US, Samsung in South Korea).
There’s the chip design itself. Nvidia, Intel, AMD, Apple, Qualcomm, Samsung, and a bunch of other ARM licensees are designing chips, sometimes with the help of ARM itself. Many of these leaders are still American companies developing the design in American offices. ARM is British. Samsung is South Korean.
Then there’s the actual equipment used in the fabs. The Dutch company ASML is the most famous, as they have a huge lead on the competition in manufacturing photolithography machines (although old Japanese competitors like Nikon and Canon want to get back in the game). But there are a lot of other companies specializing in specific equipment found in those fabs. The Japanese company Tokyo Electron and the American companies Applied Materials and Lam Research are in almost every fab in the West.
Once the silicon is fabricated, the actual packaging of that silicon into the little black packages to be soldered onto boards is a bunch of other steps with different companies specializing in different processes relevant to that.
Plus advanced logic chips aren’t the only type of chips out there. There are analog or signal processing chips, power chips, and other useful sensor chips for embedded applications, where companies like Texas Instruments dominate on less cutting-edge nodes, and memory/storage chips, where the market is dominated by three companies: South Korea’s Samsung and SK Hynix, and the American company Micron.
TSMC is only one part of this chain, standing on a tightly integrated ecosystem that it depends on. It also isn’t limited to Taiwan: the company owns fabs that are starting production in the US, Japan, and Germany.
China is working at trying to replace literally every part of the chain in domestic manufacturing. Some parts are easier than others to replace, but trying to insource the whole thing is going to be expensive, inefficient, and risky. Time will tell whether those costs and risks are worth it, but there’s by no means a guarantee that they can succeed.
- Comment on China Has Reportedly Built Its First EUV Machine Prototype, Marking a Semiconductor Breakthrough the U.S. Has Feared All Along 2 weeks ago:
No, X-rays are too energetic.
Photolithography is basically shining some kind of electromagnetic radiation through a stencil so that specific lines can be etched into the top “photoresist” layer of a silicon wafer. The radiation causes a chemical change wherever a photon hits, and the stencil blocks the photons in a particular pattern.
Photons are subject to interference from other photons (and even themselves) based on wavelength, so smaller wavelengths (which are higher energy) can resolve smaller and finer features, which ultimately means smaller transistors, so more can fit in any given area of silicon.
But once the energy gets too high, as with X-ray photons, a secondary effect ruins things. The photons have so much leftover energy after hitting the photoresist that they excite electrons, which produce their own radiation bouncing around underneath the surface. The boundary between photoresist that has been exposed to radiation and photoresist that hasn’t becomes blurry and fuzzy, which wrecks the fine detail.
So much of the 20 years leading up to commercialized EUV machines was spent finding the wavelength sweet spot for feature size: small enough to make really fine details, but low enough in energy not to cause secondary reactions.
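The wavelength/feature-size relationship above is often summarized with the Rayleigh criterion, CD ≈ k1·λ/NA. The k1 and numerical aperture values below are rough, commonly cited ballpark figures, not any specific machine’s specs.

```python
# Ballpark sketch: smallest printable feature (critical dimension)
# scales with wavelength divided by numerical aperture.
def min_feature_nm(wavelength_nm, numerical_aperture, k1):
    return k1 * wavelength_nm / numerical_aperture

duv = min_feature_nm(193, 1.35, 0.30)   # deep-UV immersion lithography
euv = min_feature_nm(13.5, 0.33, 0.40)  # extreme-UV lithography
print(f"DUV ~{duv:.0f} nm, EUV ~{euv:.0f} nm")  # EUV resolves far finer lines
```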
- Comment on Tesla Robotaxis Are Crashing More Than 12 Times as Frequently as Human Drivers 2 weeks ago:
2 lanes in each direction with a middle lane? That’s a big chunk of Texas, especially when weighted for population.
- Comment on Tesla Robotaxis Are Crashing More Than 12 Times as Frequently as Human Drivers 2 weeks ago:
How will it reduce demand for parking? Do you envision the car will drop someone off and then drive away until it finds a parking spot that’s farther than the person would want to walk?
Plenty of high demand areas use human valet parkers for this issue. The driver drops off their car at the curbside destination, and then valets take the vehicle and park it in a designated area that saves the car driver some walking.
Then, the valet parking area in dense areas has tighter parking where cars are allowed to block in others. As a result, the same amount of paved parking area can accommodate more cars. That’s why in a lot of dense cities, garages with attendants you leave keys with are cheaper than self-park garages.
Automated parking can therefore achieve higher utilization of the actual paved parking areas, a little bit away from the actual high pedestrian areas, in the same way that human valet parking already does today in dense walkable neighborhoods.
and people wouldn’t be happy waiting 5-10 minutes for their car to navigate back to them.
As with the comparison to valets, it’s basically a solved problem where people already do put up with this by calling ahead and making sure the car is ready for them at the time they anticipate needing it.
Once again reinventing buses and trains
Yes! And trains are very efficient. Even when cargo is containerized, where a particular shipping container may go from truck to train to ship, each individual containerized unit wants to take advantage of the scale between major hubs while still having the flexibility to travel between a specific origin and destination along the spokes. The container essentially hitches a ride with a larger, more efficient high-volume transport for part of its journey, and breaks off from the pack for the portions where shared routing no longer makes sense.
- Comment on Tesla Robotaxis Are Crashing More Than 12 Times as Frequently as Human Drivers 2 weeks ago:
The default in most other states is that opposite-direction traffic on a divided highway doesn’t have to stop. The states differ in what constitutes a divided highway, but generally at least 5 feet of space or a physical barrier between the lanes would qualify. In Texas, however, there is no exception for divided highways, and the key definition is “controlled-access highway,” which requires on/off ramps and physical barriers between traffic directions, or “different roadways.”
So for a 5-lane road where there are 2 lanes going in each direction with a center lane for left turns, Texas requires opposite direction traffic to stop, while most other states do not.
- Comment on Tesla Robotaxis Are Crashing More Than 12 Times as Frequently as Human Drivers 2 weeks ago:
Waymos were violating a Texas state law that requires cars to stop when a school bus stops, even in 2+ lane roads separated by a paved median, even for traffic going in the opposite direction:
liggettlawgroup.com/…/School-bus-laws-img-1024x65…
The requirement that opposite-direction traffic stop on multi-lane roads is pretty rare and might be unique to Texas. And yes, human drivers fuck this up all the time, too, leading to a lot of PSAs in Texas, especially for new residents.
- Comment on Tesla Robotaxis Are Crashing More Than 12 Times as Frequently as Human Drivers 2 weeks ago:
It’s bizarre how if you drove through twenty bus stops in three days, you would not only lose your license but be in jail on multiple charges.
This is a nearly unique Texas law that requires cars to stop when school buses are loading or unloading passengers, including traffic on the opposite side of the road going the other direction. The self-driving companies didn’t program for that special case, so it actually is a relatively easy fix in software.
And the human drivers who move to Texas often get tripped up by this law, because many aren’t aware of the requirement.
- Comment on Tesla Robotaxis Are Crashing More Than 12 Times as Frequently as Human Drivers 2 weeks ago:
Paradoxically, the large scale deployment of self driving cars will improve the walkability of neighborhoods by reducing the demand for parking.
One can also envision building on self-driving tech to electronically couple closely spaced cars so that more passengers can fit in a given area, such that throughput of passenger-miles per hour can increase several times over. Cars could tailgate like virtual train cars following each other at highway speeds with very little separation, lanes could be narrowed to fit more cars side by side in traffic, etc.
- Comment on Tesla Robotaxis Are Crashing More Than 12 Times as Frequently as Human Drivers 2 weeks ago:
Most importantly, the projections of fusion being 30 years away depended on assumptions about funding, when political considerations made it so that we basically never came anywhere close to those assumptions:
…wikimedia.org/…/File:U.S._historical_fusion_budg…
Fusion was never vaporware. We had developed working weapons relying on nuclear fusion in the 1950s. Obviously using a full-blown fission reaction to “ignite” the fusion reaction was never going to be practical, but the core physical principles were always known; the engineering and materials science just needed to catch up with alternative methods of igniting and harvesting the energy from those fusion reactions.
But we never really devoted the resources to figuring it out. Only more recently has there been significant renewed interest in funding the research to make it possible, and as you note, many different projects are hitting different milestones on the frontier of that research.
- Comment on This long-term data storage will last 14 billion years 2 weeks ago:
Writing 360 TB at 4 MB/s will take over 1000 days, almost 3 years. Retrieving 360 TB at a rate of 30 MB/s takes about 139 days. That capacity-to-bitrate ratio is going to be really hard to use in a practical way, and it’ll be critical to get that speed up. Even their target of 500 MB/s means more than 8 days to read or write the data from one storage platter.
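The numbers above are a straightforward back-of-envelope calculation:

```python
# Days needed to move a given capacity at a given sustained rate.
def transfer_days(capacity_tb, rate_mb_per_s):
    seconds = capacity_tb * 1e12 / (rate_mb_per_s * 1e6)
    return seconds / 86400  # seconds per day

print(f"write @ 4 MB/s: {transfer_days(360, 4):.0f} days")   # ~1042 days
print(f"read @ 30 MB/s: {transfer_days(360, 30):.0f} days")  # ~139 days
print(f"@ 500 MB/s:     {transfer_days(360, 500):.1f} days") # ~8.3 days
```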
- Comment on This long-term data storage will last 14 billion years 2 weeks ago:
I would argue, and I’m sure many historians and librarians and archivists would agree, that “general data backups” are essential human data. Storing the data allows for later analysis, which may provide important insights. Even things that seem trivial and unimportant today can provide very important insights later.
- Comment on This long-term data storage will last 14 billion years 2 weeks ago:
Honda won’t honor my 10-year powertrain warranty just because I yeeted my 2-year-old Civic off a bridge into salt water!
- Comment on Trains cancelled over fake bridge collapse image 3 weeks ago:
I don’t think it’d be that simple.
Any given website URL could go viral at any moment. In the old days, that might look like a DDoS that brings down the site (aka the slashdot effect or hug of death), but these days many small sites are hosted on infrastructure that is protected against unexpectedly high traffic.
So if someone hosts deceptive content on their server and it can be viewed by billions, there would be a disconnect between a website’s reach and its accountability (to paraphrase Spider-Man’s Uncle Ben).
- Comment on Earth needs more energy. Atlanta’s Super Soaker creator may have a solution. 3 weeks ago:
The company describes this generator as a solid state device, but the diagrams show the reliance on fluid/flow of hydrogen between the hot side and the cold side for moving some protons around. That seems to be something in between the semiconductor-based solid state thermoelectric generators that are already commonly understood and some kind of generator with moving solid parts.
A closed loop of hydrogen still seems like a low-maintenance solution, but relying on the chamber to remain filled with hydrogen gas is a potential maintenance/failure point as well.
- Comment on Earth needs more energy. Atlanta’s Super Soaker creator may have a solution. 3 weeks ago:
The inventor/founder at the center of the article, Lonnie Johnson, was on the team at JPL that designed and implemented the thermoelectric generators (heated by radioactive decay from plutonium-238 pellets) on the Galileo spacecraft sent to Jupiter. So I would expect that he’s more familiar with the thermodynamic and engineering challenges than even a typical expert.
The PR fluff put out by the company mentions that the theoretical basis for this specific type of generator was worked out a while ago but needed materials science to advance to the point where this type of generator can be thermodynamically and commercially feasible.
Looking at how this generator is supposed to work, it’s interesting: it does rely on the movement of fluid, but in a totally closed loop, making it a bit different from the pure solid-state, semiconductor-based Seebeck generators that are already well known.
The other claim in this article is that it can be effective with lower temperature differentials than any previous technology, which might make a huge difference in whether it can be deployed in more useful places, and thereby make it economically feasible more easily than prior concepts.
In the end, if these generators can output some electric voltage/current, they might take on similar generation characteristics to photovoltaics, which could mean that hooking them up to the grid could draw on some of the lessons learned from the rise of grid-scale solar.
- Comment on When the AI bubble bursts.. 3 weeks ago:
Specifically, desktop RAM is slabs of silicon, placed into little packages, soldered onto circuit boards in DIMM form or similar, to be plugged into a motherboard slot for RAM.
The AI demand is for the silicon itself, using advanced packaging techniques to put it on the same package as complex GPUs with very high bandwidth. These same pieces of silicon never become DIMMs, so if they fall out of use they’ll remain intertwined with chips in form factors that a consumer can’t easily make use of.
There’s not really an easy way to bring that memory back into the consumer market, even after the AI bubble bursts.
- Comment on Is Pixelfed sawing off the branch that the Fediverse is sitting on? 4 weeks ago:
More along the lines of a “pizza finder” service that scours different menus and shows the pizza options at a bunch of places, whether those places exclusively offer pizza, specialize in pizza with some other options, or just offer pizza as one of several options. It would be perfectly reasonable for such a service to only return results related to pizza, without any implicit suggestion that each place it returns only has pizza available.
- Comment on The Algorithm That Detected a $610 Billion Fraud: How Machine Intelligence Exposed the AI Industry’s Circular Financing Scheme 4 weeks ago:
Financial analysts were sounding the alarm in October. On October 7, Bloomberg ran an influential article about the circular deals:
bloomberg.com/…/openai-s-nvidia-amd-deals-boost-1…
That built on earlier reporting where they described the deals as circular, as the deals were being announced. Each of these reports notes the financial analysts at different investment firms sounding the alarm.
From there, a robust discussion happened all over the financial press about whether these circular deals were truly unstable. By the time Gamers Nexus ran that video the financial press was already kinda getting sick of the story.
Whatever the hell these trading algorithms were doing on November 20, they definitely weren’t ahead of the curve on investor knowledge and belief.
- Comment on Crucial is shutting down — because Micron wants to sell its RAM and SSDs to AI companies instead 4 weeks ago:
it’s like AI companies went from buying 10 memories as usual to 1000000,
I mean, they basically did. OpenAI announced a few months ago that they reached deals to buy 900,000 wafers of DRAM per month, representing 40% of global production capacity. For a single company. There are several other companies competing at those scales, too.
- Comment on Seems legit 4 weeks ago:
Plenty of the AI functions on phones are on-device. I know the iPhone can do several text-processing tasks (summarizing, translating) offline, and Apple has an API for third-party developers to use on-device models. And the Pixels have Gemini Nano on-device for certain offline functions.
- Comment on Elon Musk’s Grok Goes Haywire, Boasts About Billionaire’s Pee-Drinking Skills and ‘Blowjob Prowess’ 1 month ago:
but I don’t think most are designed to control people’s opinions
Yeah I’m on team chaos theory. People can plan and design shit all they want, but the complexity will lead to unexpected behavior, always. How harmful that unwanted behavior is, or how easy it is to control or contain, is often unknown in advance, but invented things tend to develop far, far outside the initial vision of the creators.
- Comment on 3 months ago:
Being able to point a camera at something and have AI tell me “that’s a red bicycle” is a cool novelty the first few times, but I already knew that information just by looking at it.
Visual search is already useful. People go through the effort of posting requests to social media or forums asking “what is this thing” or “help me ID these shoes and where I can buy them” or “what kind of spider is this” all the time. They’re not searching for red bicycles, they’re taking pictures of a specific Bianchi model and asking what year it was manufactured. Automating the process and improving the reliability/accuracy of that search will improve day to day life.
And I have strong reservations about the fundamental issues of inference engines being used to generate things (LLMs and diffusers and things like that), but image recognition, speech to text, and translation are areas where these tools excel today.
- Comment on Are Cars Just Becoming Giant Smartphones on Wheels? 3 months ago:
The eyebrow raiser in the Slate’s base configuration is that it doesn’t come with any audio systems: no radio antenna/tuner, no speakers. It remains to be seen how upgradeable the base configuration is for audio: how involved it will be to install speakers in the dash or doors, to add antennas (especially for AM, which is tricky due to interference from EV systems), etc.
I’d imagine that most people would choose to spend a few thousand on an audio upgrade just to reach the bare minimum expectations one would have for a new vehicle, so that cuts into the affordability of the package.
- Comment on Are Cars Just Becoming Giant Smartphones on Wheels? 3 months ago:
The analog dials were an illusion. That information has been processed digitally for at least the last 25 years.
- Comment on "Very dramatic shift" - Linus Tech Tips opens up about the channel's declining viewership 3 months ago:
What I’m saying is if YouTube is sharing $10 million of revenue with channel owners in a month that has 1,000,000,000 total views across YouTube, that’s a penny per view.
Then, if the next month they reconfigure the view counts to exclude certain bots or views below a particular threshold, you might see the overall view count drop from 1,000,000,000 to 500,000,000 while still hitting the same overall revenue. At that point it’s $0.02 per view, so a channel that sees its view count drop in half may still see the same revenue despite the drop in view count.
If it’s a methodology change across all of YouTube, a channel that stays equally popular as a percentage of all views will see the revenue stay the same, even if the view counts drop (because every other channel is seeing their view counts drop, too).
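The mechanism is easy to see with toy numbers (these are illustrative, not real YouTube figures):

```python
# Why a view-counting methodology change can leave channel revenue flat:
# the payout pool is fixed, so the per-view rate adjusts inversely.
def revenue_per_view(total_payout_usd, total_views):
    return total_payout_usd / total_views

before = revenue_per_view(10_000_000, 1_000_000_000)  # $0.01/view
after = revenue_per_view(10_000_000, 500_000_000)     # $0.02/view

# A channel whose views drop in half along with everyone else's:
channel_views_before, channel_views_after = 2_000_000, 1_000_000
print(channel_views_before * before)  # revenue under the old count
print(channel_views_after * after)    # identical revenue under the new count
```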
- Comment on "Very dramatic shift" - Linus Tech Tips opens up about the channel's declining viewership 3 months ago:
Isn’t that the formula? They take all of the revenue, set aside the percentage they’ve set for revenue share, and then divide that among all channels based on viewer counts. Dropping viewership for all channels proportionally means that the same amount of revenue will still be distributed to the channels in the previous ratios.
- Comment on Big Surprise—Nobody Wants 8K TVs 3 months ago:
Most 4k streams are 8-20 Mbps. A UHD runs at 128 Mbps.
Bitrate is only one variable in overall perceived quality. There are all sorts of tricks that can significantly reduce file size (and thus bitrate of a stream) without a perceptible loss of quality. And somewhat counterintuitively, the compression tricks work a lot better on higher resolution source video, which is why each quadrupling in pixels (doubling height and width) doesn’t quadruple file size.
The codec matters (h.264 vs h.265/HEVC vs VP9 vs AV1), and so do the settings actually used to encode. Netflix famously is willing to spend a lot more computational power on encoding, because they have a relatively small number of videos and many, many users watching the same videos. In contrast, YouTube and Facebook don’t even bother re-encoding into a more efficient codec like AV1 until a video gets enough views that they think they can make up the cost of additional processing with the savings of lower bandwidth.
Video encoding is a very complex topic, and simple bitrate comparisons only barely scratch the surface in perceived quality.
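One slightly better apples-to-apples comparison than raw bitrate is bits spent per pixel per frame. The resolution and frame-rate figures below are illustrative assumptions (3840×2160 at 24 fps), not measurements of any specific service:

```python
# Normalize a bitrate by how many pixels it has to cover each second.
def bits_per_pixel(bitrate_mbps, width, height, fps):
    return bitrate_mbps * 1e6 / (width * height * fps)

stream = bits_per_pixel(15, 3840, 2160, 24)    # mid-range 4K stream
bluray = bits_per_pixel(128, 3840, 2160, 24)   # UHD Blu-ray ceiling
print(f"stream: {stream:.3f} bpp, Blu-ray: {bluray:.3f} bpp")
```

Even this ratio (~8.5× here) overstates the perceived gap, since modern codecs and careful encoding settings spend those bits far more efficiently than the disc formats’ older assumptions required.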