The Long Dark on Apple M1 Max: Actually decent gaming


Guest jeffpeng

Hello fellow survivors,

Just to peek out of COVID-recovery hiding, I wanted to briefly share my findings running my favorite game on my new, ridiculously expensive but luckily corporate-sponsored Mac Studio. I was provided the peasant base variant with the M1 Max and its 24 GPU cores.

Without going too much into detail: the game runs exceptionally smoothly at around 80 frames per second on average at the not-too-shabby resolution of 3440 by 1440 with settings on full tilt, while rarely dipping below 60. All of this while staying inaudible, by the way.

So if you are considering purchasing one of these, or an equivalent notebook with the same SoC, but are torn apart by worries about whether you will be able to keep playing your favorite game (which, I assume, is The Long Dark) at times when you should clearly be working: worry no more. It's a very solid gaming experience that can rival modern consoles.

If you are one of those who already owns an M1 machine and wonders how this compares to the 8-GPU-core SoCs: a little more than linear scaling. Linear would be 3x (24 cores versus 8); I'm seeing roughly three and a quarter times the performance, which I attribute to the vastly improved memory subsystem on the M1 Pro/Max machines.

I'm not sure if this requires explicit stating, but nonetheless: I do not recommend anyone buy one of these machines solely for the purpose of gaming. Not only do many games have no Mac port (and there is no Proton for Mac), you would also get quite an impressive gaming PC for the price of a Mac Studio (which is $2K US, or 2300 Euros). Unless you have a reason to own one of these other than gaming (or are employed at a company that quadrupled its business over the span of two years and is keen on burning cash for tax evasion purposes), there are better things to do with your money.

But it's nice that it works as well as it does.

(Ah, yeah, briefly .... typical me.)


Cool. I'm actually in the middle of using raspberry pis to save our firm a big pile of dough on workstations.

Pretty sure that this game won't run on that one... nope.

Good to hear that the prospects are good for the architecture.

 


Guest jeffpeng
5 hours ago, stratvox said:

Cool. I'm actually in the middle of using raspberry pis to save our firm a big pile of dough on workstations.

Pretty sure that this game won't run on that one... nope.

Good to hear that the prospects are good for the architecture.

 

I played a lot with SBCs a few years back when we actually considered entering the digital signage market (which we did not). One board, which was fairly expensive, had a dual-core A72 (iirc the RPi 4 has four of those), which was brand new at the time, plus four A53s I think, and that was actually very impressive. Impressive as in: it could run a browser and office applications at 4K convincingly fast and smooth.

But whatever Apple is doing with their ARM architectures .... they're an entirely different beast. I guess a lot of that will eventually trickle down (not least because everyone is hiring Apple chip engineers away left, right and center), so I can imagine we'll see TLD run on ARM Linux eventually. It's just a shame apparently nobody can agree on a graphics API anymore.

But no, it won't run on an RPi. I mean, CPU-wise .... yeah. It runs on the Switch, and that has four A57 cores. But the GPU of the RPi is just an order of magnitude weaker than what NV uses in their Tegra SoCs. It would actually be interesting if one could somehow convince TLD to run on a Jetson. 🤔 I mean, the game is Mono/C#, so that's not the issue. And apparently the Jetson/Tegra supports at least Vulkan 1.2. 🤔🤔🤔🤔 Shame they use IL2CPP, since if they didn't this would actually be an interesting project.

Addendum: The reason it runs so well on ARM Macs is .... pure voodoo. You can't do this kind of emulation unless the CPU is designed to do this kind of emulation. As far as I understand it, it's a mixture of translating AMD64 code to ARMv8 ahead of time, implementing parts of the x86 specification in silicon (like the memory consistency model), and then raw runtime, QEMU-style binary emulation as a fallback. Electron apps are notorious for falling back to that for reasons I don't fully understand, but it has to do with something funky Google does with V8, and then emulation sucks just as much as it normally would. 😄
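Semi-related: if you're ever curious whether the process you're sitting in (say, your Node or Electron runtime) is actually running translated, macOS exposes a sysctl for exactly that. A minimal sketch in Node/TypeScript, assuming Apple Silicon and that the spawned sysctl inherits the caller's translation status (which, as far as I know, it does):

```ts
// Minimal sketch: report whether the current process runs natively on Apple
// Silicon or under Rosetta 2 translation. Assumes macOS and Node.js.
import { execSync } from "node:child_process";

function rosettaStatus(): string {
  try {
    // sysctl.proc_translated is 1 for a translated process, 0 for native arm64.
    const translated = execSync("sysctl -n sysctl.proc_translated").toString().trim();
    return translated === "1" ? "translated (Rosetta 2)" : "native arm64";
  } catch {
    // The sysctl doesn't exist on Intel Macs (or non-macOS systems).
    return "not applicable";
  }
}

console.log(rosettaStatus());
```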


On 5/9/2022 at 6:31 PM, jeffpeng said:

It's just a shame apparently nobody can agree on a graphics API anymore.

Oh man. We're trying to figure out how to enable using graphics cards in worker systems to do (mostly) matrices, and that is a massive pain. We're currently working on using WebGL to enable it; it's easy for the version of our computation worker that runs in a browser, but we also want to enable workers that run standalone, and being able to exploit GPUs there is a challenge across platforms. The APIs just keep proliferating. We can do Vulkan on Linux and Windows (and things like Raspbian etc. too, because Debian), but Apple and other ARM platforms are a whole 'nother world.
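To make that a bit more concrete, here's roughly the shape of the portable fallback: a plain typed-array matrix multiply that any JS engine can run, which is what the browser worker hands off to WebGL when it can. This is just a minimal sketch in TypeScript, not our actual worker code, and the names are made up:

```ts
// Minimal sketch of a portable compute kernel: multiply an m×k matrix by a
// k×n matrix, both stored row-major in Float32Arrays. This is the lowest
// common denominator; the hard part is shipping the same work to WebGL,
// Vulkan, Metal, ... when a GPU is available.
export function matmul(
  a: Float32Array, b: Float32Array,
  m: number, k: number, n: number
): Float32Array {
  const c = new Float32Array(m * n);
  for (let i = 0; i < m; i++) {
    for (let p = 0; p < k; p++) {
      const aip = a[i * k + p];
      for (let j = 0; j < n; j++) {
        c[i * n + j] += aip * b[p * n + j]; // accumulate row i of the result
      }
    }
  }
  return c;
}

// Inside a Web Worker the wiring is just message passing:
// self.onmessage = (e) => {
//   const { a, b, m, k, n } = e.data;
//   postMessage(matmul(a, b, m, k, n));
// };
```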

If you're interested, the big picture is that we're creating a way to allow people to buy and sell CPU cycles for computation, with an eye on scientific simulations. We hope to bring the current price (i.e. what people pay to rent machines at Azure, Amazon, and Google) down significantly by allowing people to turn idle systems into computation workers in exchange for tokens, and to have markets where people can buy and sell tokens for money. Sort of like using computers to crunch numbers to create crypto, except the idea is that the number crunching will have real social and monetary value, so all that CPU time won't be for naught. We're eyeing scientific applications, but we are also working with the health sector and so on to enable them to turn the office machines in hospitals etc. into ad-hoc compute networks, to allow them to benefit from ML apps for scheduling, or for health units to do pandemic test-and-trace analysis, and so on. There are a lot of potential applications for this tech, including video games; I foresee it being used in MMOGs to have the players run workers on spare cores to run the online game's "ecosystem" and help make the online worlds more "realistic".

One of my colleagues is the dude in charge of trying to corral all the various graphics APIs and every once in a while he just loooooses it. ;)


Guest jeffpeng
On 5/14/2022 at 2:58 PM, stratvox said:

If you're interested, the big picture is that we're creating a way to allow people to buy and sell CPU cycles for computation, with an eye on scientific simulations. We hope to bring the current price (i.e. what people pay to rent machines at Azure, Amazon, and Google) down significantly by allowing people to turn idle systems into computation workers in exchange for tokens, and to have markets where people can buy and sell tokens for money. Sort of like using computers to crunch numbers to create crypto, except the idea is that the number crunching will have real social and monetary value, so all that CPU time won't be for naught. We're eyeing scientific applications, but we are also working with the health sector and so on to enable them to turn the office machines in hospitals etc. into ad-hoc compute networks, to allow them to benefit from ML apps for scheduling, or for health units to do pandemic test-and-trace analysis, and so on. There are a lot of potential applications for this tech, including video games; I foresee it being used in MMOGs to have the players run workers on spare cores to run the online game's "ecosystem" and help make the online worlds more "realistic".

One of my colleagues is the dude in charge of trying to corral all the various graphics APIs and every once in a while he just loooooses it. ;)

Oh, this is super interesting. I know most people now think of crypto when it comes to using idle cycles, but I was one of those people running SETI@Home back in the day; it just fascinated me.

I know we're drifting very far off topic here, but one thing I've been thinking about for a long time, possibly a decade or so, is how we waste enormous amounts of electricity, or more specifically electrically produced heat, in centralized data centers. There you go to real lengths (as you would know) to cool your systems with energetically very expensive contraptions, just to go home at the end of the day and fire up a gas or oil burner to heat your home.

One idea would be to bring the heat to the home, but that is expensive and lossy, and would require an entirely new infrastructure. Fun fact: living in Potsdam near Berlin, I enjoyed district heating from a nearby power station, basically free heat. But that only worked because the communists back in the 70s wisely laid big ugly pipes throughout the district to facilitate it (not that I am much fond of the idea of communism, to be honest).

Now the other idea, and that's where decentralized computing in the way you are describing comes in, is to bring the computation to the home and, simply speaking, use computers as space heaters. Using electricity just to make heat is not economical in itself, but if you do computation with it that I get paid for, enough to cover the electricity and the lifetime cost of the device, then ... well, not gonna say no to free heat. Plus: it's perfectly feasible to run modern silicon at 55°C perpetually, which is a very good temperature to even drive the classic radiators you would find in homes with a central heating unit. So instead of a big gas burner you could run a high-powered computation rig with a heat exchanger attached to it, and limit power consumption (and hence available clock cycles) to whatever you require in terms of thermal output.
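Purely as a back-of-the-envelope sketch, and with every number invented just for illustration, the break-even logic would look something like this:

```ts
// Toy calculation for the "computation rig as space heater" idea.
// Every constant below is an assumption for illustration, not a measurement.
const rigPowerKw = 1.0;            // assumed draw of the compute rig, kW (≈ a small space heater)
const hoursPerDay = 10;            // assumed heating hours per day
const electricityEurPerKwh = 0.35; // assumed household electricity price
const computeEurPerKwh = 0.40;     // assumed payout for sold compute, per kWh consumed

const energyKwhPerDay = rigPowerKw * hoursPerDay;    // essentially all of it ends up as heat
const cost = energyKwhPerDay * electricityEurPerKwh; // what you pay the utility
const revenue = energyKwhPerDay * computeEurPerKwh;  // what the compute buyer pays you
const netHeatCost = cost - revenue;                  // <= 0 means the heat is free (or better)

console.log(`heat delivered: ~${energyKwhPerDay} kWh/day`);
console.log(`net cost of that heat: ${netHeatCost.toFixed(2)} EUR/day`);
```

If the payout per kWh clears the electricity price (plus some amortization for the hardware), the heat really is free; if not, you're back to running an expensive space heater.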

The advantage over mining crypto for the same purpose: you'd actually do something useful (because from a social and real-world-economics standpoint, mining crypto for heat is just as stupid as a plain space heater), and you would have much more predictable revenue, since a service buying those CPU cycles from you would almost certainly fluctuate far less and carry a much lower investment risk than Ethereum mining has (or had ^^).

But when I looked into the issue I ran into exactly that problem: there are no general-purpose distributed computing frameworks offered as a service. What Microsoft and Amazon do is sell cycles on centralized computing hardware. What a service like the one you seem to be working on would do is sell cycles and then buy them from whoever is offering them, in a decentralized manner. I know there are some blockchain attempts to solve this, but the computational cost of the blockchain itself is pretty prohibitive for such a thing.

So, lots of talk: I hope you guys succeed in your endeavor. It sounds like maybe the most interesting thing to do in IT at the moment. If what you are trying there takes off .... it very much could revolutionize how we think about large-scale computing. And maybe solve one or two environmental issues while doing that. Man, it's great to live in the future.


It is very much like that; a central inspiration is SETI@home-style BOINC networks, but with a single language that can be used across hardware platforms: currently on the BOINC networks it's necessary to write clients for each platform you want to support. We allow write once, run anywhere with an ECMA 1.6 compatible interpreter that supports bignum, which covers almost anything. We've even had a worker running on a fridge and a washing machine in the lab; they had a JavaScript interpreter built in because it was used to build the GUIs on the devices.
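To give a flavor of what "write once, run anywhere" means in practice, here's a minimal sketch of a work function: plain ECMAScript plus BigInt, no platform APIs, so the same code runs in a browser tab, in Node, or on that washing machine. The names here are made up for illustration, not our actual API:

```ts
// A self-contained slice of work: a pure function of its input, no I/O,
// no globals, so the scheduler can ship it to whatever device is idle.
// BigInt ("bignum") keeps the arithmetic exact on any engine with ES2020 support.
function factorial(n: bigint): bigint {
  let acc = 1n;
  for (let i = 2n; i <= n; i++) acc *= i;
  return acc;
}

export function workFunction(input: { n: string }): { digits: number } {
  const result = factorial(BigInt(input.n));
  return { digits: result.toString().length };
}

console.log(workFunction({ n: "1000" })); // { digits: 2568 }
```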

I think one of the plugs we should be using is "The Distributed Computer's machine language is javascript." 

We have also used wasm and are developing a set of libs we're calling bifrost to allow cross-code generation from languages commonly used in the scientific and math communities, like Python and R. I'm not super up on all the deets of how that works; I'm the guy who keeps the machines running more than the one writing the software for them.

If you're interested in looking at it more, you can visit https://distributed.computer, https://kingsds.network, and https://portal.distributed.computer. The last in particular is where you'd set up an account and be able to run workers in web browsers to start earning tokens. We're still under very active development, but we're beginning to get some serious traction in both the science and business worlds, along with some other niches, especially in health services. Of course, you can always ask me too; I can put you directly in touch with the folks doing the heavy dev on it.

And yeah, if you have a modern machine on an architecture where we can exploit the GPU, we can definitely help you heat the joint.

