BorgDrone
@BorgDrone@lemmy.one
- Comment on Windows 11 is now automatically enabling OneDrive folder backup without asking permission 2 days ago:
If it’s a machine used for business: corporate espionage.
- Comment on Even Apple finally admits that 8GB RAM isn't enough 3 days ago:
Yes, there are massive advantages. It’s basically what makes unified memory possible on modern Macs. Especially with all the interest in AI nowadays, you really don’t want a machine with a discrete GPU/VRAM, a discrete NPU, etc.
Take for example a modern high-end PC with an RTX 4090. Those only have 24GB of VRAM, and that VRAM is only accessible through the (relatively slow) PCIe bus. AI models can get really big, and 24GB can be too little for the bigger ones. You can spec an M2 Ultra with 192GB of RAM and almost all of it is directly accessible by the GPU. Even better, the GPU can access it without any need to copy data back and forth over the PCIe bus, so literally zero overhead.
The advantages of this multiply when you have more dedicated silicon. For example: if you have an NPU, that can use the same memory pool and access the same shared data as the CPU and GPU with no overhead. The M series also have dedicated video encoder/decoder hardware, which again can access the unified memory with zero overhead.
For example: you could have an application that replaces the background on a video using AI. It takes a video and decompresses it using the video decoder; the decompressed frames are immediately available to all other components. The GPU can then pre-process the frames, the NPU can use the processed frames as input to an AI model to generate new frames, and the video encoder can immediately access the result and compress it into a new video file.
The overhead of just copying data for such an operation on a system with non-unified memory would be huge. That’s why I think that the AI revolution is going to be one of the driving factors in killing systems with non-unified memory architectures, at least for end-user devices.
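For a rough idea of what that zero-copy model looks like in code, here is a minimal Metal sketch on Apple silicon. The buffer size and names are illustrative and the shader/pipeline setup is omitted; this is just the sharing part:

```swift
import Metal

// A single allocation, visible to both CPU and GPU at the same address.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("no Metal device")
}

let frameSize = 3840 * 2160 * 4 // one 4K BGRA frame, for example
guard let frame = device.makeBuffer(length: frameSize,
                                    options: .storageModeShared) else {
    fatalError("allocation failed")
}

// CPU side (e.g. where decoder output lands): write directly into it.
let pixels = frame.contents().bindMemory(to: UInt8.self, capacity: frameSize)
pixels[0] = 255

// GPU side: bind the very same buffer to a compute shader, no blit needed.
// computeEncoder.setBuffer(frame, offset: 0, index: 0)
```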
- Comment on Why not serve fried chicken on Juneteenth? How is it different from serving corned beef on St. Patrick’s day? 1 week ago:
So basically it became a stereotype because black people knew how to have a good time and throw a party with lots of guests and delicious food?
- Comment on Why not serve fried chicken on Juneteenth? How is it different from serving corned beef on St. Patrick’s day? 1 week ago:
They would take some elements of black culture, like (…) saying they love fried chicken and watermelon
How did this become a stereotype? Doesn’t everyone love fried chicken and watermelon regardless of skin color? They are both delicious.
- Comment on Masahiro Sakurai refused to add Dolby Surround to a Kirby game because players had to sit through the logo 5 weeks ago:
Sound is at least as important to the experience as the picture. Go watch a scary movie with the sound muted and you’ll notice it’s not scary at all.
Playing a game or watching a movie with just 2.0 audio, or worse, the TV's built-in speakers, is such a diminished experience that you might as well not bother.
- Comment on Masahiro Sakurai refused to add Dolby Surround to a Kirby game because players had to sit through the logo 5 weeks ago:
Without at least 5.1, why even bother playing games or watching movies?
- Comment on Masahiro Sakurai refused to add Dolby Surround to a Kirby game because players had to sit through the logo 5 weeks ago:
So because some people have a crappy home theater setup everyone should have a crappy experience?
- Comment on The Mac vs. PC war is back on? 5 weeks ago:
Simple economies of scale. They are expensive to produce because they don't make a lot of them. The intended audience for the monitor it goes with doesn't need a stand, and that monitor is a niche product. Neither is meant for the consumer market to begin with, and the monitor, even with the stand, is cheaper than many of the alternatives.
- Comment on The Mac vs. PC war is back on? 5 weeks ago:
You are confusing 'costs a lot of money' with 'overpriced'.
Yes, Apple hardware costs a lot of money, but you do get what you pay for.
My current MacBook Pro (M1 Max, 64GB RAM) is simply the best machine I've ever used. It's a no-compromise laptop. It's fast and chews through everything I throw at it (which is a lot, as I use it as a development machine). It never slows down, it never gets hot, and I've never heard the fan run (not sure if it's just that silent or if it simply never needs to turn on). The screen is amazing. The trackpad is amazing. The sound is amazing. The build quality is rock solid. The battery life is insane. I plug in a single Thunderbolt cable and it charges my machine, connects to gigabit Ethernet and my audio system, and drives two high-res monitors (5k2k and 4K).
Every time PC people claim they can get a 'better computer' for less, it's always some compromise. "This one has a much faster GPU and is cheaper." Sure, it also weighs 8 kilos, runs for 20 minutes on a full charge, is made of cheap plastic, has a screen with terrible viewing angles and a crappy trackpad, and sounds like a fighter jet with full afterburner every time you put a little load on the system.
- Comment on Is Your Phone Listening to You? | NOVA 1 month ago:
There is also the practicality angle. If apps were listening in on all the random bullshit conversations people have, that would be such an unbelievable crapton of data to sift through that it would simply be uneconomical, even if it were possible. All that just to show you an ad for cat food that pays out maybe one cent IF someone clicks on it?
As for the lab-grown diamonds thing, there is a real possibility that it went exactly the other way around. The ads didn't get shown because they talked about it; they talked about it because of the ads. We see ads all the time, to the point that we're no longer consciously aware of them. Obviously, they still influence our behavior or companies wouldn't spend a fortune on them. So a lab-grown diamond company is running an ad campaign on FB. Someone sees that ad and it doesn't consciously register, but it plants the idea of lab-grown diamonds in their head. This causes them to bring it up in a conversation later. Now consciously aware of the concept, they suddenly notice the ads they ignored earlier.
IMO, this is a much more realistic and even scarier scenario than apps listening in. It’s apps manipulating your unconscious thoughts.
- Comment on How to opt out of the privacy nightmare that comes with new Hondas 1 month ago:
LOLWUT, I only buy cars that old or older. Why would I spend an absolute fortune on a new-ish car that I barely use anyway, when I can get a perfectly reliable older car for a fraction of the price?
- Comment on Novel attack against virtually all VPN apps neuters their entire purpose 1 month ago:
(…) the entire purpose and selling point of VPNs, which is to encapsulate incoming and outgoing Internet traffic in an encrypted tunnel and to cloak the user’s IP address.
No. That is not the entire point of a VPN. That’s just what a few shady companies are claiming to scam uninformed users into paying for a useless service. The entire point of a VPN is to join a private network (i.e. a network that is not part of the Internet) over the public internet, such as connecting to your company network from home. Hence the name ‘virtual private network’.
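To make that concrete, here is roughly what a client config for that classic use case looks like in WireGuard (all names, keys, and addresses here are made up):

```ini
[Interface]
PrivateKey = <client private key>
Address = 10.0.100.2/32

[Peer]
PublicKey = <company gateway public key>
Endpoint = vpn.example.com:51820
# Only the company's private subnet is routed through the tunnel.
# A commercial "privacy VPN" would put 0.0.0.0/0 here instead,
# funneling ALL of your traffic through their servers.
AllowedIPs = 10.0.0.0/16
```

That AllowedIPs line is the whole difference: joining a private network routes only that network's addresses through the tunnel, while the 'hide your IP' services just route everything.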
There are very few, if any, benefits to using a VPN service to browse the public internet.
- Comment on Dating apps are as if someone turned the job application experience into a pastime activity 1 month ago:
For me it’s the exact opposite. Job applications are the closest I’ll ever get to experiencing what it must be like to be a woman in online dating.
- Comment on Because of smartphones, pocket TVs were never a thing. 1 month ago:
We have more OTA channels; you just have to pay for them. The free channels are in shitty SD quality (you have to pay for HD), and they are only unencrypted because the government requires it (as they are used for emergency broadcasts).
- Comment on Because of smartphones, pocket TVs were never a thing. 1 month ago:
Here you get a grand total of three shitty channels for free OTA. Anything more requires a subscription.
- Comment on Because of smartphones, pocket TVs were never a thing. 1 month ago:
Which has significantly worse picture quality than cable or fiber, has fewer channels, and isn’t even significantly cheaper.
- Comment on Because of smartphones, pocket TVs were never a thing. 1 month ago:
So? Not sure why the difference matters. What is even the use of a tuner anymore?
- Comment on Because of smartphones, pocket TVs were never a thing. 1 month ago:
Your point?
- Comment on Because of smartphones, pocket TVs were never a thing. 1 month ago:
Do they?
I can watch my local TV channels from the other side of the planet. I don’t think the signal reaches that far.
- Comment on Because of smartphones, pocket TVs were never a thing. 1 month ago:
Also, my TV provider’s app allows me to watch live TV on my phone.
- Comment on The future of mobile devices? 2 months ago:
My quote explicitly ties wired and wireless charging together into the document.
It just says they are keeping an eye on developments in both wireless and wired charging standards.
- Comment on The future of mobile devices? 2 months ago:
Given that they’re focused on reduction of waste and reduction of market fragmentation there’s definitely a question mark over only QI based charging.
I don’t see how you can get that from the text. The way I read it, wired and wireless charging are separate. There is nothing in the directive that mandates one should be used over the other. They explicitly require USB-C for wired charging, but do not put any requirements on wireless, as there doesn’t seem to be any significant fragmentation on the wireless side (i.e. no need to enforce a standard if everyone already agrees on a standard).
- Comment on The future of mobile devices? 2 months ago:
Nope.
Look at the actual directive, not some press release. Note that this is an older directive, but the common charger directive only describes the changes that need to be made to the earlier directive. The first link is to the latest updated version of that directive.
I quote (emphasis mine):
In so far as they are capable of being recharged by means of wired charging, the categories or classes of radio equipment referred to in point 1 of this Part shall:
2.1. be equipped with the USB Type-C receptacle, as described in the standard (…)
2.2. be capable of being charged with cables which comply with the standard (…)
At the moment the directive does not prescribe a universal standard for wireless charging, but it does reserve the right to do so in the future. (At the moment it doesn’t seem necessary, as everyone seems to be adopting Qi.)
- Comment on The future of mobile devices? 2 months ago:
The EU does not require a charging port, it only says that if you have a charging port it must be USB-C.
- Comment on The first Apple-approved emulators for the iPhone have arrived 2 months ago:
Also for anything UI-related. You want to test how it actually feels to use, e.g. whether you can reach the UI elements with one hand. Using it with a mouse on a monitor just doesn’t give you a good sense of that, especially if your UI involves gestures.
- Comment on The first Apple-approved emulators for the iPhone have arrived 2 months ago:
Yes, that’s what I mean. It’s a simulator, not an emulator. It does not work exactly like a real device. For simple stuff, sure, but if you dive below the surface even a little it’s very different.
One example is anything to do with the GPU / Metal. It has a very different set of capabilities and limitations than actual iOS hardware.
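A small illustration of how leaky the abstraction is: Swift has a dedicated build condition for the simulator, and Metal capability queries can answer differently there than on a real iPhone (the exact GPU family check below is just an example):

```swift
import Metal

// Compiled only for simulator builds; commonly used to stub out or
// downgrade features the simulated GPU doesn't support.
#if targetEnvironment(simulator)
let runningOnSimulator = true
#else
let runningOnSimulator = false
#endif

// Capability queries run against your Mac's GPU in the simulator, not an
// iPhone GPU, so the answers can differ from a real device.
if let device = MTLCreateSystemDefaultDevice() {
    print("simulator build: \(runningOnSimulator)")
    print("Apple7 GPU family: \(device.supportsFamily(.apple7))")
}
```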
- Comment on The first Apple-approved emulators for the iPhone have arrived 2 months ago:
Apple already has had emulators for iOS for years, it’s how most devs do mobile development.
AFAIK Apple does not release an iPhone emulator to the public. There is one third-party emulator I’m aware of, but that’s mainly intended for security research, not general development.
it’s way nicer than running on an actual iPhone or iPad (I don’t have either anyway).
Hard disagree.
- Comment on The first Apple-approved emulators for the iPhone have arrived 2 months ago:
Apple provides an iPhone emulator as part of their official SDK.
No they don’t.
- Comment on Apple argues in favor of selling Macs with only 8GB of RAM 2 months ago:
Limited RAM (even a 4090 only has 24GB) and slow transfers to/from VRAM. The GPU can only operate on data in VRAM, so anything you need it to work on has to be copied over the relatively slow PCIe bus to the GPU. Then, once it’s done, you need to copy the results back over the PCIe bus to system RAM for the CPU to be able to access them. This considerably slows down GPGPU tasks.
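As a sketch of what that round trip looks like in Metal terms (this assumes a machine with a discrete GPU; the buffer size is arbitrary and the actual kernel work is omitted):

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let cmd = queue.makeCommandBuffer(),
      let blit = cmd.makeBlitCommandEncoder() else {
    fatalError("Metal setup failed")
}

let size = 256 * 1024 * 1024 // say, 256MB of input data

// CPU-visible staging buffer in system RAM, GPU-only buffer in VRAM.
let staging = device.makeBuffer(length: size, options: .storageModeShared)!
let vram = device.makeBuffer(length: size, options: .storageModePrivate)!

// Copy #1: system RAM -> VRAM across PCIe, before the GPU can even start.
blit.copy(from: staging, sourceOffset: 0,
          to: vram, destinationOffset: 0, size: size)

// ... compute kernels operate on `vram` here ...

// Copy #2: VRAM -> system RAM across PCIe, before the CPU can read results.
blit.copy(from: vram, sourceOffset: 0,
          to: staging, destinationOffset: 0, size: size)
blit.endEncoding()
cmd.commit()
```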
- Comment on Apple argues in favor of selling Macs with only 8GB of RAM 2 months ago:
“unified memory” is an Apple marketing term for what everyone’s been doing for well over a decade.
Wrong. Unified memory (UMA) is not an Apple marketing term; it’s a description of a computer architecture that has been in use since at least the 1970s. For example, game consoles have always used UMA.
Every single integrated GPU in existence shares memory between the CPU and GPU; that’s how they work.
Again, wrong.
While iGPUs have existed for PCs for a long time, they did not use a unified memory architecture. What they did was reserve a portion of system RAM for the GPU. For example, on a PC with 512MB RAM and an iGPU, 64MB might have been reserved for the GPU, so the CPU had access to only 512 − 64 = 448MB. While they shared the same physical memory chips, they each had a separate address space. If you wanted to make a texture available to the GPU, it still had to be copied into the reserved RAM, and the CPU could not access that region directly.
With unified memory, both CPU and GPU share the same address space. Both can access the entire memory. No RAM is reserved purely for the GPU. If you want to make something available to the GPU, nothing needs to be copied, you just need to point to where it is in RAM. Likewise, anything done by the GPU is immediately accessible by the CPU.
Since there is one memory pool for both, you can use RAM more efficiently. If you have a discrete GPU with 16GB VRAM and your app only needs 8GB, the other half just sits there being useless. Conversely, if your app needs 24GB of VRAM, you can’t run it because your GPU only has 16GB, even if you have lots of system RAM available.
With UMA you can use all the RAM you have for whatever you need it for. On an M2 Ultra with 192GB RAM you can use almost all of that for the GPU (minus the bit used by the OS and any running apps). Even on a tricked-out PC with a 4090 you can’t run anything that needs more than 24GB of VRAM. Want to run something where the GPU needs 180GB of memory? No problem on an M2 Ultra.
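Metal on Apple silicon even lets you hand the GPU memory the CPU already allocated, with no copy at all. A minimal sketch (the alignment requirements are real, the size is arbitrary):

```swift
import Metal
import Foundation

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("no Metal device")
}

// bytesNoCopy requires page-aligned memory and a page-multiple length.
let pageSize = Int(getpagesize())
let length = 64 * pageSize

var memory: UnsafeMutableRawPointer?
posix_memalign(&memory, pageSize, length)

// No copy happens here: the GPU sees the exact same pages the CPU
// has been writing to all along.
let buffer = device.makeBuffer(bytesNoCopy: memory!,
                               length: length,
                               options: .storageModeShared,
                               deallocator: { pointer, _ in free(pointer) })
print("GPU-visible buffer of \(buffer?.length ?? 0) bytes, zero bytes copied")
```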
It has nothing to do with soldering the RAM.
It has everything to do with soldering the RAM. One of the reasons iGPUs sucked, apart from not using UMA, is that GPU performance is almost always limited by memory bandwidth. Compared to VRAM, standard system RAM has much, much less bandwidth, making iGPUs slow.
A high-bandwidth memory bus, like a GPU needs, has a lot of connections and runs at high speeds. The only way to do this reliably is to physically place the RAM very close to the actual GPU. Why do you think GPUs do not have user-upgradable RAM?
Soldering the RAM makes it possible to integrate a CPU and a non-sucking GPU. Go look at the inside of a PS5 or XSX and you’ll see the same thing: an APU with the RAM chips soldered to the board very close to it.
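Some back-of-the-envelope numbers (using typical published specs) to show why bus width, and thus soldering, matters so much. Peak bandwidth is roughly bus width in bytes times transfer rate:

$$
\begin{aligned}
\text{DDR5-5600, dual channel (socketed):}\quad & 128/8 \times 5.6 \approx 90\ \text{GB/s} \\
\text{M1 Max, 512-bit LPDDR5 (soldered):}\quad & 512/8 \times 6.4 \approx 410\ \text{GB/s} \\
\text{RTX 4090, 384-bit GDDR6X (soldered):}\quad & 384/8 \times 21 \approx 1008\ \text{GB/s}
\end{aligned}
$$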
This again has little to do with being socketed though: LPCAMM supports up to 9.6GT/s, considerably faster than what ships with the latest macs.
LPCAMM is a very recent innovation. Engineering samples weren’t available until late last year, and the first products will only hit the market later this year. Maybe this will allow for Macs with user-upgradable RAM in the future.
The only way discrete GPUs can possibly be outcompeted is if DDR starts competing with GDDR and/or HBM in terms of bandwidth
What use is high bandwidth memory if it’s a discrete memory pool with only a super slow PCIe bus to access it?
Discrete VRAM is only really useful for gaming, where you can upload all the assets to VRAM in advance and data flows almost exclusively from CPU to GPU, with very little going the other way. Games don’t matter to the majority of users; GPGPU is much more interesting to the general public.