Aceticon
@Aceticon@lemmy.dbzer0.com
- Comment on butt mogged these zoomers today 1 day ago:
You’re off by at least 2 decades.
- Comment on butt mogged these zoomers today 1 day ago:
Is it just me whose reaction was “So?! What’s the big deal?”
Don’t get me wrong: good for her for having a big booty. It’s just that I don’t have any expectation of the girls in the picture being shocked, nor am I myself shocked or otherwise impressed by her crouching position emphasising said big booty.
For me the whole thing is a kinda normal teen/young-adult showoff pose, hence no big deal.
- Comment on Palworld confirms ‘disappointing’ game changes forced by Pokémon lawsuit 5 days ago:
Copyright if elements of the game such as 3D models, images and code have been copied.
Trademark if the name of the game is used (i.e. “Stardew Valley Romance Sims”).
Patents for game mechanics.
As a side note, personally I think that game mechanics shouldn’t be patentable at all.
- Comment on i truly believe that there's an open war between Humanity vs. Advertisers and their allies. 5 days ago:
Any live TV - where I live they all show the same ads.
- Comment on i truly believe that there's an open war between Humanity vs. Advertisers and their allies. 5 days ago:
Actually I think it’s worse: human salesmen cost money, whilst this shit is mainly automated or uses distribution systems where one person does the work and millions get exposed to it (for example TV), so the numbers involved and the relentlessness of the pestering are far, far larger in scale than if human salesmen were doing it.
- Comment on i truly believe that there's an open war between Humanity vs. Advertisers and their allies. 5 days ago:
Have a look at Perfume adverts on TV: they are literally entirely made up of imagery meant to make one think about sex and being sexy, with not a single thing in any of them about the actual quality of the product.
Car adverts too are similar, but their imagery is about things like Freedom, Family, Friendship, Party, Joy and so on (depending on the car). Almost none of them talks about the qualities of the actual vehicle.
Adverts not relying on this kind of psychological manipulation are the ones which look a lot like 1950s adverts and talk about the actual qualities of the product.
Under-investment in training advertising creatives doesn’t mean that adverts have stopped using Psychology tricks: that way of making adverts is now so widespread and common in the industry (because it works!) that people just learn those things as tricks of the trade rather than needing any kind of special extra training in Psychology.
- Comment on i truly believe that there's an open war between Humanity vs. Advertisers and their allies. 5 days ago:
With modern advertising techniques, adverts mainly work by psychological manipulation - putting a name in your mind, making you associate it with an emotion (for example cars are “freedom”, perfume is “lust”), inducing fear of a non-existent problem and then selling you a “solution”, and so on.
It’s like being the focus of a gaggle of salesmen who are slick manipulators with training in Psychology and with no ethics at all.
That’s how it is every day in every place (even the comfort of your own home) in the advert heavy world we live in if one doesn’t fight to keep that shit away.
- Comment on Chips aren’t improving like they used to, and it’s killing game console price cuts 1 week ago:
Well, this being the Internet, it’s natural to expect less than impeccable truth from strangers here: a lot of people just want to feel like they “won” the argument no matter what, so they’ll bullshit their way into a “win”; most people aren’t really trained in the “trying to be as complete and clear as possible” mental processes of Engineers and Scientists (so there’s a lot of “I think this might be so” being passed off as “it is so”); and it simply feels bad to be wrong, so most people don’t want to accept it when somebody else proves them wrong and react badly to it.
I’m actually a trained Electronics Engineer, but since I don’t actually work in that domain and studied it decades ago, some of what I wrote are informed extrapolations based on what I learned and stuff I’ve read over the years, rather than me being absolutely certain that’s how things are done nowadays (which is why looking up and reading that Intel spec was very interesting, even if it turned out things are mainly as I expected).
Also, I’m sorry for triggering you; you don’t need to say sorry for your reaction and I didn’t really take it badly: as I said, this is the Internet and a lot of people are argumentative for the sake of “winning” (probably the same motivation as most gaslighters), so I expect everybody to be suspicious of my motivations, same as they would be of any other random stranger ;)
Anyways, cheers for taking the trouble to explain it and to make sure I was okay with our interaction - that’s far nicer and more considerate than most random internet strangers.
- Comment on I knew one day I’d have to watch powerful men burn the world down. I just didn’t expect them to be such losers 1 week ago:
If they were posh people with “breeding”, that wouldn’t make being the direct or indirect victim of their destruction any better, unless you’re an upper middle class Briton who has been brought up with 19th century notions of “born to rule” elites, such as most columnists for The Guardian.
I get that people here derive a little enjoyment from taking the piss out of the character flaws of these wankers, but where they are coming from is not where this columnist is coming from.
- Comment on Chips aren’t improving like they used to, and it’s killing game console price cuts 1 week ago:
Well, I wasn’t sure if you meant that I did say that or if you just wanted an explanation, so I both clarified what I said and I gave an explanation to cover both possibilities :)
I think the person I was replying to just got confused when they wrote “integrated memory”, since, as I explained, when main memory is “integrated” in systems like these, that just means it’s soldered on the motherboard, something which really makes no difference in terms of architecture.
There are processing units with integrated memory (pretty much all microcontrollers), which means they come with their own RAM (generally both Flash RAM and SRAM) in the same integrated circuit package or even on the same die, but that’s at the very opposite end of processing power from a PC or PS5, and the memory amounts involved tend to be very small (a few MB or less).
As for the “integrated graphics” bit, that’s actually the part that matters for the performance of systems with dedicated CPU and GPU memory vs systems with shared memory (integrated in the motherboard or otherwise, since being soldered on the motherboard or coming as modules doesn’t really change the limitations of each architecture), which is what I was talking about all the way back in the original post.
- Comment on Chips aren’t improving like they used to, and it’s killing game console price cuts 1 week ago:
Hah, now you made me look that stuff up, since I was talking anchored in my knowledge of systems with multiple CPUs and shared memory - that was my expectation for the PS5’s system architecture, since in the past that’s how they did things.
So, for starters, I never mentioned “integrated memory”, I wrote “integrated graphics”, i.e. the CPU chip comes together with a GPU, either in the same package (as two separate dies) or even on the same die.
I think that when people talk about “integrated memory” what they mean is main memory which is soldered on the motherboard rather than coming as discrete memory modules. From the point of view of systems architecture it makes no difference, however from the point of view of electronics soldered memory can be made to run faster and soldered connections are much closer to perfect than the mechanical contact connections you have for memory modules inserted in slots.
(Quick explanation: at very high clock frequencies the electronics side starts to behave in funny ways. The frequency of the signal travelling on the circuit board gets so high, and hence the wavelength so small - down to centimeters or even millimeters, near the length of circuit board traces - that you start getting effects like signal reflections and interference between circuit lines (they work as mini antennas and can induce effects on nearby lines), so it’s all a lot messier than if the thing were just running at a few MHz. Reflections can happen at connections which aren’t perfect, such as the mechanical contacts of memory modules inserted into slots, so at higher clock speeds the signal integrity of the data travelling to and from the memory is worse than with soldered memory, whose connections are much closer to perfect.)
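To put rough numbers on the wavelength point, here’s a quick back-of-the-envelope calculation (the 0.5c propagation speed is an assumption - a typical ballpark for signals on common PCB material, not a spec for any particular board):

```python
# Wavelength of a signal on a PCB trace: lambda = v / f, where v is the
# propagation speed on the board (a fraction of the speed of light in
# vacuum; ~0.5c is an assumed ballpark figure here).
C = 3.0e8        # speed of light in vacuum, m/s
V = 0.5 * C      # assumed propagation speed on the board, m/s

def wavelength_cm(freq_hz: float) -> float:
    """Signal wavelength on the trace, in centimeters."""
    return V / freq_hz * 100.0

# At a few MHz the wavelength dwarfs any trace on the board, so the
# "mini antenna" effects are negligible; at GHz memory-bus speeds the
# wavelength shrinks to a few centimeters - comparable to trace lengths.
print(wavelength_cm(10e6))   # 10 MHz -> 1500 cm (15 m)
print(wavelength_cm(3e9))    # 3 GHz  -> 5 cm
```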
As far as I know, nowadays L1, L2 and L3 caches are always part of the CPU/GPU die, though I vaguely remember that in the old days (80s, 90s) the memory cache might be in the form of dedicated SRAM modules on the motherboard.
As for integrated graphics, here’s some reference for an Intel SoC (system on a chip, in this case with the CPU and GPU together in the same die). If you look at page 5 you can see a nice architecture diagram. Notice how memory access goes via the memory controller (lower right, inside the System Agent block) and then the SoC Ring Interconnect which is an internal bus connecting everything to everything (so quite a lot of data channels). The GPU implementation is the whole left side, the CPU is top right and there is a cache slice (at first sight an L4 cache) shared by both.
As you see there, in integrated graphics the memory access doesn’t go via the CPU; rather, there is one memory controller (and, in this example, a memory cache) for both, and memory access for both the CPU and the GPU cores goes through that single controller and shares that cache (but not the lower level caches: notice how the GPU implementation contains its own L3 cache, bottom left, labelled “L3$”).
With regards to the cache dirtying and contention problems I mentioned in the previous post: at least that higher level (L4) cache is shared, so instead of cache entries being invalidated because main memory was changed behind the cache’s back, what you get is a different performance problem where there is competition for cache usage between the areas of memory used by the CPU and the areas used by the GPU. Since the cache is much smaller than main memory, it can only hold copies of part of it, and if the two devices are working on different areas of main memory they’re both causing those areas to get cached, but the cache can’t fit both, so it’s constantly ejecting entries for one area to make room for the other, which massively slows it down (there are lots of tricks to make this less of a problem, but it’s still slower than if just one processing device were using that cache). As for contention, there are generally far more data channels in an internal interconnect like the one you see there than in the data bus to the main memory modules, plus that internal interconnect will be way faster, so contention will be lower for cached memory accesses, but cache misses (which have to go to main memory) will still suffer from two devices sharing the same number of main memory data channels.
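That cache competition effect is easy to demonstrate with a toy model - a minimal LRU cache with two interleaved access streams whose combined working set exceeds the cache capacity (all the sizes here are made-up illustrative numbers, not real cache geometries):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache model that only counts misses."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.lines = OrderedDict()
        self.misses = 0

    def access(self, addr: int) -> None:
        if addr in self.lines:
            self.lines.move_to_end(addr)        # hit: mark most recently used
        else:
            self.misses += 1                    # miss: fetch, evicting if full
            if len(self.lines) >= self.capacity:
                self.lines.popitem(last=False)  # evict least recently used
            self.lines[addr] = True

def miss_rate(accesses, capacity):
    cache = LRUCache(capacity)
    for addr in accesses:
        cache.access(addr)
    return cache.misses / len(accesses)

# One device cycling over a working set that fits in the cache...
single = [a for _ in range(100) for a in range(8)]
# ...vs two devices interleaved: each working set would fit on its own,
# but combined they exceed the cache, so LRU keeps ejecting one
# device's lines to make room for the other's (thrashing).
shared = [a for _ in range(100)
            for pair in zip(range(8), range(100, 108)) for a in pair]

print(miss_rate(single, capacity=12))  # low: only the compulsory misses
print(miss_rate(shared, capacity=12))  # thrashing: almost every access misses
```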
- Comment on Chips aren’t improving like they used to, and it’s killing game console price cuts 1 week ago:
When two processing devices try to access the same memory there are contention problems, as the memory cannot be accessed by two devices at the same time (well, sorta: parallel reads are fine, it’s when one side is writing that there can be problems), so one of the devices has to wait. That makes it slower than dedicated memory, but the slowdown is not constant, since it depends on the memory access patterns of both devices.
There are ways to improve this (for example, if you have multiple channels on the same memory device, then contention issues are reduced to accesses within the same memory block, which depends on the block size, though this also means that parallel processing on the same device - i.e. multiple cores - cannot use the channels being used by a different device, so it’s slower).
There are also additional problems with things like memory caches in the CPU and GPU - if an area of memory cached in one device is altered by a different device that has to be detected and the cache entry removed or marked as dirty. Again, this reduces performance versus situations where there aren’t multiple processing devices sharing memory.
In practice the performance impact is highly dependent on if and how the memory is partitioned between the devices, as well as on the amount of parallelism in both processing devices (the latter because of my point above that memory modules have a limited number of memory channels, so multiple parallel accesses to the same memory module from both devices can lead to stalls in the cores of one or both devices when not enough channels are available for both).
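That channel-sharing stall effect can be sketched with a toy simulation (the channel count and the random access pattern are made-up for illustration; real memory controllers schedule requests far more cleverly than this):

```python
import random

def cycles_to_drain(num_channels: int, num_devices: int,
                    requests_per_device: int, seed: int = 0) -> int:
    """Toy model: each cycle, every device with work left issues one
    request to a random channel; a channel serves one request per
    cycle, so a device colliding on a busy channel stalls and retries."""
    rng = random.Random(seed)
    pending = [requests_per_device] * num_devices
    cycles = 0
    while any(pending):
        cycles += 1
        claimed = set()
        for dev in range(num_devices):
            if pending[dev]:
                ch = rng.randrange(num_channels)
                if ch not in claimed:      # channel free this cycle
                    claimed.add(ch)
                    pending[dev] -= 1      # request served
                # else: collision -> this device stalls this cycle
    return cycles

# A single device never collides, so it drains in exactly one cycle per
# request; add a second device on the same channels and random
# collisions stretch the run out.
print(cycles_to_drain(4, 1, 1000))
print(cycles_to_drain(4, 2, 1000))
```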
As for the examples you gave, they’re not exactly great:
- First, when loading models into GPU memory, even with SSDs the disk read is by far the slowest part and hence the bottleneck, so as long as things are being done in parallel (i.e. whilst data is loaded from disk to CPU memory, already loaded data is also being copied from CPU memory to GPU memory) you won’t see that much difference between loading to CPU memory and then from there to GPU memory, versus loading directly to GPU memory. Further, the manipulation of models in shared memory by the CPU introduces the very performance problems I was explaining above, namely contention from both devices accessing the same memory blocks, and GPU cache entries getting invalidated by the CPU having altered the main memory data.
- Second, if I’m not mistaken, tone mapping is highly parallelizable (as pixels are independent - I think, but I’m not sure, since I haven’t actually implemented this kind of post-processing), which means the device that is by far the best at parallel processing - the GPU - should be handling it in a shader, not the CPU.
I don’t think that direct access by the CPU to manipulate GPU data is a good thing at all (for the reasons given above), and to get proper performance out of a shared memory setup, at the very least the programming must be done in a special way that tries to reduce collisions in memory access, or the whole thing must be set up by the OS like it’s done on PCs with integrated graphics, where a part of the main memory is reserved for the GPU by the OS itself when it starts and the CPU won’t touch that memory after that.
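As an illustration of that per-pixel independence, here’s a minimal global tone mapping operator in numpy (the Reinhard curve is just one common choice and an assumption here; the point is that each output pixel depends only on its own input pixel, which is exactly the shape of work a GPU shader handles well):

```python
import numpy as np

def reinhard_tonemap(hdr: np.ndarray) -> np.ndarray:
    """Map HDR values in [0, inf) down to [0, 1). Each output pixel is
    a pure function of the matching input pixel, so the whole image can
    be processed in parallel - e.g. one GPU thread per pixel in a shader."""
    return hdr / (1.0 + hdr)

hdr = np.array([[0.0, 1.0],
                [4.0, 99.0]])
ldr = reinhard_tonemap(hdr)
print(ldr)  # every value squeezed into [0, 1)
```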
- Comment on Chips aren’t improving like they used to, and it’s killing game console price cuts 1 week ago:
Just to add to this: the reason you only see shared memory setups on PCs with integrated graphics is that it lowers performance compared to dedicated memory, which is less of a problem if your GPU is only being used in 2D mode (mainly because that uses little memory), but more of a problem in 3D mode, which is how the PS5 is meant to be used most of the time.
So the PS5 having shared memory is not a good thing and actually makes it inferior to a PC with a GPU and CPU of similar processing power using the dominant gaming PC architecture (separate memory).
- Comment on The Brits had an anthem ready for when Margaret Thatcher died. Americans should also be prepared. 2 weeks ago:
Something around there being a new toilet in the country but with the music from The Sound Of Music?
- Comment on Ubisoft Accused of 'Secret Data Collection' in Single-Player Games 2 weeks ago:
In Lutris there’s a “Command prefix” configuration option, both per-game and in the global config as the default for all games, which is where the firejail command line goes (basically, to sandbox with firejail you’re supposed to run “firejail firejail-options original-command original-options”, and putting firejail and its options in “Command prefix” does that).
Note that there are other sandboxing options that run in the same way as firejail but I found firejail to have more straightforward options.
Also note that this won’t sandbox the actual setup of a game, only the running of the game.
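As a concrete sketch of how the prefix composes (the game path and args are illustrative placeholders; `--net=none` is the real firejail option for cutting off network access and `--noprofile` skips the default profile):

```shell
# Value to put in the Lutris "Command prefix" field (global or per-game):
#   firejail --noprofile --net=none
#
# With that prefix set, Lutris effectively launches the game as:
firejail --noprofile --net=none /path/to/game-binary --game-args
```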
- Comment on Ubisoft Accused of 'Secret Data Collection' in Single-Player Games 2 weeks ago:
I run all my games in Linux and everything but Steam goes via Lutris which I configured to, by default, launch them inside a Firejail sandbox with no network access (plus a bunch of other security related limitations) something which I can override for specific games if needed.
It’s interesting that Steam games are actually the least secure ones to run in Linux; with a configuration like mine, it’s literally safer to run pirated shit downloaded from the Internet.
- Comment on Winning 2 weeks ago:
I’m curious about that too.
My life experience includes environments (Physics at University level) with a significant number of exceptionally intelligent people and in my observation they weren’t any more “flawed” than everybody else, just with different quirks than most people.
Granted “smart and perceived as intelligent” isn’t actually the same as high IQ, but I’ve also worked in environments with lots of people like that (Investment banking) and again they weren’t any more “flawed” than everybody else and just had different kinds of quirks than most people.
One thing I did notice was that more intelligent people tend to have more “compensation layers” over their dysfunctions than less intelligent people.
That said, all this is my opinion from my own life experience, so just as unsupported as the previous poster’s.
- Comment on Winning 2 weeks ago:
Also one might be aware of the problem but not actually understand the underlying causes.
One can be a bloody genius and still be unable to self-rationalize one’s way out of certain negative behaviours: because they’re driven by things at an emotional level (fear, pleasure, habit, need for approval, low self-esteem and so on), because they became entrenched as behavioural patterns when one was too young to understand any of it (as a child or teenager - it’s not by chance that a lot of Psychology “blames” one’s parents), and because without the distance that comes from looking at it from the outside - with no interest in seeing certain things rather than others (nobody wants to see elements of one’s own personality as negative) - it’s extremely hard to spot things which, for an observant trained outsider, are very obvious.
Also, I totally agree that one shouldn’t go into it wanting the therapist to like you: people who worry about the impression they make on the therapist are likely not being fully open and honest about themselves to him or her, which kinda defeats the point of going to therapy (if one were 100% perfect and all qualities, why go to therapy?).
- Comment on 34% of the US population doesn't vote. Why do polticalitcians cling to the idea that these voters can't be reached? 2 weeks ago:
From my own impression as a member of a small political party in my own country (which I joined not out of tribalism but simply because they seemed to mostly want the same things as I do): party members live in a bubble of people who are heavily into politics and understand the importance of politics. The leadership specifically, in addition to this, are also mostly surrounded by generally unquestioning hero worship from the common party members, plus they tend to have quite limited life experience outside the party, as they joined it as young adults (maybe when they were at university and involved in student movements) and it and its internal environment have always been a large part of their lives.
Those people usually see the supporters of their political adversaries in the same way as fans of a sports club see fans of other clubs, and don’t really “get” the point of view of people who don’t vote at all.
- Comment on Just Beware 3 weeks ago:
“Beware of shape-shifting murderous alien” would’ve required a bigger board, so it’s cheaper to put it like this.
- Comment on Former UK Prime Minister Liz Truss to launch ‘free speech’ social media platform 3 weeks ago:
Time to start live streaming a lettuce again, as per the traditional Liz Truss standard for how long her projects last…
- Comment on What are some FOSS programs that are objectively better than their proprietary counterparts? 3 weeks ago:
I think that in the database space MS-SQL was never the best option at any level or at least not for long.
Oracle could be said to still be the best amongst databases for high performance and very large datasets, but in my experience, in the smaller and mid-sized database space, things like Postgres and even NoSQL databases had already surpassed MS-SQL back in the late 90s and early 00s.
- Comment on What are some FOSS programs that are objectively better than their proprietary counterparts? 3 weeks ago:
The Apache Web Server
- Comment on British soldiers tune radio waves to fry drone swarms for pennies 3 weeks ago:
Basically, EMP but directed.
- Comment on I hope she found herself 4 weeks ago:
“Curious how the name of the person we’re calling for is the same as my name”
- Comment on I hope she found herself 4 weeks ago:
When it comes to finding oneself, the journey matters more than the destination…
- Comment on Dear Big Tech, Stop Shoving AI Into Operating Systems 4 weeks ago:
Well, the “magical” Steam config was the stuff others pointed out: in Steam, under Settings -> Compatibility, you need to enable use of Steam Play with Proton for all titles, since that’s not enabled by default.
- Comment on Do you use your blinker in a car? 4 weeks ago:
First thing: the blinkers are for your own safety, but they’re also for other people’s safety and to help improve the flow of traffic (for example, if somebody waiting at a T-junction to go into your lane sees in a timely fashion that you’re going to exit at the junction, they’ll know earlier they can come in rather than have to wait to see what you actually do and this kind of thing times many such situations adds up to better traffic flow).
Second thing: it’s a lot easier and cognitively simpler to just do it without thinking, rather than evaluating each situation to see if you should do it or not, and a good mental trigger for training it as an instinctive movement is “if I’m going to turn out of my lane, I’ll turn the blinker on for the side I’m turning to” - so: turning off the road -> blinker on that side; changing lane -> blinker on the side I’m moving to; going to stop and park on the side of the road -> blinker on that side.
Personally I just use the blinkers in all such situations and don’t even have to think about it; as for my first point, that just informs how early I do it (I’ve trained myself to do it quite a bit before I turn). This does mean that at times I’ll use the blinkers when there is nobody else around to actually use that information, simply because I’m not thinking “should I do it or should I not?”, I’m just unthinkingly executing a trained impulse. I’ve never had any problems with excessive wear and tear on the blinker lights, and since I don’t need to think about it I can focus on more important things, so as I see it, even at the cost of sometimes using the blinkers when there’s nobody to see them, I’m still better off having trained myself to do it like this.
That said, blinker usage amongst drivers massively depends on the country and the general driving culture there. For example, where I come from, Portugal, most people only use the blinkers in situations where they stand to gain from it themselves (for example when exiting a road to the left, or crossing a lane, where others might give you way out of good manners if they know your intentions); of the rest, most will use them for the safety of other cars but almost none will do it for the safety of pedestrians. Whilst in The Netherlands (where I picked up my current habits on this) they’re generally pretty thorough about using blinkers in all situations they should, quite independently of seeing or not people who might use that information (possibly because of all the bicycles around, as they’re often hard to spot in the mirrors when in certain positions relative to a car - which are exactly the positions where knowing that the car wants to turn is important for the safety of the cyclist).
- Comment on Dear Big Tech, Stop Shoving AI Into Operating Systems 4 weeks ago:
I had quite a lot of the same frustration because, although I was never a sysadmin (more like a senior dev who has done a lot of design and development for software systems whose back and middle tiers run on Linux servers, which involved amongst other things managing development servers), I was used to the Linux of a decade and more ago (i.e. runlevels and the old-style commands for things like network info), and the whole SystemD stuff plus a bundle of newly fashionable command line info and admin tools was quite frustrating to get to grips with.
That said, I’ve persevered and have by now been using Linux on my gaming rig for 8 months with very few problems and a pretty high success rate at running games (most of which require no tweaking). Then again, I only figured out the “magical” Steam config settings to get most games to run on Linux when I was desperately googling how to do it.
Oh, and by the way, Pop!OS is a branch of Ubuntu, so at least when it comes to command line tools and locations of files in the filesystem, most help for Ubuntu out there also works with Pop!OS.
- Comment on Dear Big Tech, Stop Shoving AI Into Operating Systems 4 weeks ago:
I moved to Linux on my gaming rig (this last time around - I’ve had it as dual boot on and off since the 90s, but this time I moved to it for good after confirming that gaming works way better on it than ever before) when I had a GTX1050 Ti, and I had no problems ^*^.
Updated it to an RTX3050 and still no problems ^*^.
Then again, I went with Pop!OS because it’s a gaming oriented distro with a version that already comes with the NVIDIA drivers, so they sort out whatever needs sorting out on that front. Plus I’m sticking with X and staying the hell away from Wayland on NVIDIA hardware, since there are a lot more problems for NVIDIA hardware with Wayland than with X.
Currently on driver 565.77
I reckon a lot of people with NVIDIA driver problems in Linux are trying to run it with Wayland rather than X or going for the Open Source drivers rather than the binary ones.
^*^ Actually I do have a single problem: when graphics mode starts, often all I get is a black screen and I have to switch my monitor OFF and back ON again to solve it. I guess it’s something to do with the HDMI side of things.