• 0 Posts
  • 19 Comments
Joined 3 months ago
Cake day: December 19th, 2024


  • CPU interrupts. There are timer interrupts that can be used for this. In hibernate, only a tiny fraction of the CPU's transistors are changing state, and a transistor only draws meaningful power when it changes state; transistor state changes when the clock cycles. So you set the countdown register for the timer interrupt and signal the CPU to hibernate. The timer circuit keeps listening to the clock while the rest of the CPU stops listening. Each clock cycle subtracts one from the register, and when the register reaches zero, the timer interrupt wakes the rest of the CPU. Just like moving your mouse or pressing the power button: those raise interrupts, which wake the CPU.
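    The countdown logic described above can be sketched as a toy model (register width and tick behavior are hypothetical, not any real SoC's timer block):

```python
class WakeTimer:
    """Toy model of a hibernate wake timer: the timer block keeps
    seeing clock edges while the rest of the core is clock-gated."""

    def __init__(self, countdown: int):
        self.countdown = countdown   # value programmed before hibernate
        self.core_awake = False      # rest of the core starts gated

    def clock_tick(self) -> None:
        if self.core_awake:
            return                   # core already running; nothing to do
        self.countdown -= 1          # one clock cycle elapsed
        if self.countdown == 0:
            self.core_awake = True   # timer interrupt wakes the core


t = WakeTimer(countdown=3)
for _ in range(3):
    t.clock_tick()
print(t.core_awake)  # -> True: the timer interrupt fired on the third tick
```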


  • You gotta work on your Linux kung fu. chroot has always been around; you can install any distro, any version, side by side. Now there's even DistroBox. As for Apple: with the switches from 680x0 to PowerPC to Intel (Arm being the exception), Apple customers were told to pound sand every time. Imagine you spent 10 to 20 thousand dollars on hardware, equipment, and software (it adds up; it's not just buying a MacBook Pro for these artists), just to buy it all again. That's why Apple has always been called out for this. Windows forced updates are hilarious and have only gotten worse over the years from what I hear. Linux can be updated live with no reboot; all my servers are set up like that, and my work dev machine. Even the kernel gets updated live. And obviously Android and its forced data-collection apps would be a huge no-no for a Linux distro.

    I'd say these aren't just "problems" with the OSes. Problems are something you can fix yourself or find a workaround for. You can't work around Windows Update, thousands and thousands of dollars of investment in the Apple ecosystem going down the drain every time Apple switches architectures, or Google's mandatory spyware apps.

    Like, isn’t this a failure of the Wine approach (again: FOSS architecture) to keep up with hardware, more than an actual problem with using a Mac?

    Dude, you think this is about 32-bit libraries? It's about way more than that. Apple customers paid money for OS X. Why would anybody think FOSS is responsible for fixing a problem Apple knowingly created, and not just one time? Keep in mind, Microsoft solved this problem with their WoW64 translation layer (like WINE, but for 32-bit Windows binaries on 64-bit Windows). Linux has a couple of solutions: chroot, or rebuilding and repackaging the binaries. Obviously there could be a 32-bit-to-64-bit translation layer for Linux like Windows has, but why bother when you have chroot? The same thing can be done on other Unix-like OSes. Apple should have done this for each architecture change; there was no reason to f over their customers each time. Also, keep in mind, I'm not an Apple user, never have been. So it's them you have to convince that they "weren't screwed over, over and over again." Seriously, this was a joke in the late 90's. Now it's just reached bullshit levels. Rosetta was the least Apple could do.

    Especially when after digging in sufficiently deeply to understand it, you find that it’s actually a deficiency with Wine, not Apple.

    WINE should fix this for Apple? WINE doesn't fix it for Windows or Linux or any of the BSDs or any other Unix or Unix-like operating system out there. None of them. If Apple wants to use WINE as the solution, then maybe Apple should pony up some of the money they made on OS X sales, pay some WINE developers, and make WINE a first-class citizen in OS X. Valve needed WINE for their OS, and they came out of pocket: engineers and money. Apple can do the same, especially for how much their customers pay. There is no justification for dumping Apple's mess on FOSS to fix.


  • It's the same with Windows. I worked on minkernel and onekernel. There are a ton of preprocessor directives for different CPUs and all kinds of hardware, even preprocessor directives for different companies. Unused code paths are eliminated at compile time. The preprocessor directives are more of an annoyance for the developers anyway; if you don't organize your code, you get what you deserve.


  • Naa, I used to be a Windows kernel dev for Intel. The backwards compatibility comes from the WoW64 translation layer in 64-bit Windows; WSL1 was built off the same framework. WoW64 worked pretty well back then. Apparently there's enough drift now that you'll get the occasional old game or program that no longer translates well under 64-bit Windows but works under WINE. Rare, but still good for a laugh.

    We could do the same under Linux, but it just seems pointless. Linux has chroot. The Steam Runtime is based on containers, which are based on chroot. Chroot is how we played 32-bit games on 64-bit Linux distros back before distro maintainers started including 32-bit libraries with their 64-bit install images.
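    The core job of a translation layer like WoW64 is re-marshaling 32-bit call layouts into the 64-bit shapes the native side expects. A toy illustration using Python's struct module (the field layout here is hypothetical, not a real Windows structure):

```python
import struct

# Hypothetical 32-bit caller layout: two 32-bit fields (pointer, length).
raw32 = struct.pack("<II", 0x1000, 256)

def thunk_to_64(blob: bytes) -> bytes:
    """Zero-extend each 32-bit field to 64 bits, the way a
    WoW64-style layer re-marshals arguments for the 64-bit side."""
    ptr, length = struct.unpack("<II", blob)
    return struct.pack("<QQ", ptr, length)

raw64 = thunk_to_64(raw32)
print(struct.unpack("<QQ", raw64))  # -> (4096, 256)
```

The real layer also swaps CPU mode and redirects filesystem/registry paths, but the pointer-widening step is the part that breaks when old binaries and new structures drift apart.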


  • highball@lemmy.world to linuxmemes@lemmy.world · I still like this meme
    7 days ago

    When distro maintainers started building and shipping 64-bit versions, they didn't include 32-bit libraries. You had to make a chroot for a 32-bit distro, then symlink those libraries in among your 64-bit libraries. Once distro maintainers were confident in the 64-bit builds, they added the 32-bit libraries themselves. In the case of Windows, Microsoft created a translation layer similar to WINE called WoW64 (Windows 32-bit on Windows 64-bit). Apple is the only one who said "fuck you, buy new software" to their customers. Rosetta is the first time Apple didn't tell their customers to go pound sand; probably not by choice.
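    The symlink step above can be sketched like this (the directory names and library file are made up for illustration; a real setup would point real 32-bit ELF libraries at wherever the loader's multilib path looks):

```python
import os
import tempfile

# Hypothetical layout: a 32-bit chroot's lib dir and the host's 64-bit tree.
root = tempfile.mkdtemp()
lib32 = os.path.join(root, "chroot32", "usr", "lib")
lib64 = os.path.join(root, "usr", "lib64")
os.makedirs(lib32)
os.makedirs(lib64)

# Pretend the 32-bit chroot ships libfoo.
with open(os.path.join(lib32, "libfoo.so.1"), "w") as f:
    f.write("elf32 bits here")

# Symlink it where the host side expects to find 32-bit libraries.
os.symlink(os.path.join(lib32, "libfoo.so.1"),
           os.path.join(lib64, "libfoo.so.1"))

print(os.path.islink(os.path.join(lib64, "libfoo.so.1")))  # -> True
```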




  • Don't fall for his flame bait. Linux is the number-one most used OS in the world; it dominates every market except console and desktop. Once Microsoft can no longer use vendor lock-in to artificially maintain its grip on the desktop market, you'll see all kinds of engineering dollars pour into desktop Linux from OEMs. Look at OS X: it flopped in the server market (despite being "technically" Unix), and Apple shut down an entire division (Xserve) because OS X Server sucked so badly. Azure is getting dominated by Linux. Linux has 80% of the IoT market despite Windows being free for IoT. OS X and Windows only exist because of vendor lock-in.



  • highball@lemmy.world to linuxmemes@lemmy.world · Idc
    28 days ago

    Correct: Azure Linux. They've been slowly building out their Linux distro piece by piece over the years. It's more expensive to run Windows in the cloud than Linux. My bet is that Office 365 will one day give you Azure Linux with a Windows userland and a Windows DE; 90% of users probably wouldn't even know the difference. The few folks whose programs actually need Windows will fall back to full Windows while everybody else just uses Azure Linux, saving Microsoft millions.



  • It's called a terminal emulator because it emulates, graphically, what used to be output to a printer at the console of a mainframe. Then came CRT monitors. Machines like the PDP-10 would output to a printer or a CRT monitor; that was your terminal. A printer writes the output from the mainframe one character at a time, left to right, top to bottom, and the CRT terminals were made to do the same. Before output went to a printer or CRT, it showed on a set of lights on the console; if you watched them change enough, you knew where you were in your program as it ran (something only doable because opcodes weren't flying through superscalar pipelines at GHz speeds). With printers and monitors, you could get far more feedback from a running or exiting program, and give input to the system via a keyboard.

    So the terminal is not "technically" a GUI. We use a GUI to emulate a terminal, which receives the actual terminal output from the system and displays it for you. They are not the same thing at all. GUI is a paradigm for what you display on a monitor for the user to interact with; modern monitors are fast enough that they work well with that paradigm. You definitely wouldn't be sending GUI content to a printer.
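    The character-at-a-time, left-to-right, top-to-bottom behavior above is easy to model. A toy "glass teletype" (grid size and API invented for illustration):

```python
class GlassTTY:
    """Toy 'glass teletype': characters land left to right, top to
    bottom, exactly like the line printer it replaced."""

    def __init__(self, cols: int, rows: int):
        self.cols, self.rows = cols, rows
        self.grid = [[" "] * cols for _ in range(rows)]
        self.col = self.row = 0

    def put(self, ch: str) -> None:
        if ch == "\n":                     # carriage return + line feed
            self.col, self.row = 0, self.row + 1
            return
        self.grid[self.row][self.col] = ch
        self.col += 1
        if self.col == self.cols:          # wrap, like paper advancing
            self.col, self.row = 0, self.row + 1

    def line(self, n: int) -> str:
        return "".join(self.grid[n])


tty = GlassTTY(cols=8, rows=2)
for ch in "hello\nok":
    tty.put(ch)
print(repr(tty.line(0)))  # -> 'hello   '
print(repr(tty.line(1)))  # -> 'ok      '
```

A real terminal emulator adds escape-sequence parsing, scrollback, and cursor addressing on top of this grid, but the grid is the heart of it.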




  • highball@lemmy.world to linuxmemes@lemmy.world · Jumping Steps
    2 months ago

    Been using Linux for several decades now. I've always been able to throw in a floppy or a CD, or now a thumb drive, boot up, and easily fix what's wrong. Plus it's rare to even have to do that. The times I've used Windows, when things go wrong and it's not a simple fix, the best you can do is format and reinstall. I have friends who are numb to that; they figure they might as well, since they'll just have to format and reinstall anyway because Windows gets slower over time. One friend had a monthly Windows reinstall on his calendar. I've never once thought, "wow, Linux is getting slow, let me format and reinstall." How can that even be an acceptable solution to anybody? Sure, if things went so badly sideways that everything is corrupted, but that would be one hell of an extreme exception.




  • highball@lemmy.world to linuxmemes@lemmy.world · 2025 baby
    2 months ago

    The only markets Linux doesn't dominate are the desktop and console space. The only thing holding back desktop domination is Microsoft and its vendor lock-in strategy; it says a lot when Microsoft has to use the power of its purse to maintain its position. Linux even dominates the IoT space with ~80% of the market, despite Microsoft making Windows IoT free.