Ubuntu is phasing out i386 (32bit) libraries altogether

Discussion in 'Offtopic Discussions' started by Klumpen, Jun 18, 2019.

  1. Klumpen

    Klumpen Run away! Run away!

    Joined:
    Jan 5, 2012
    Messages:
    6,606
    Location:
    Uncanny Valley
    https://discourse.ubuntu.com/t/i386...dropped-starting-with-eoan-ubuntu-19-10/11263

    From the Ubuntu Devel Announce mailing list:
Unless stuff gets ported to 64-bit, it won't function anymore. That said, 18.04 is supported until 2023, so there's still time to adapt... yet I can see a lot of problems coming my way.
     
    Last edited: Jun 18, 2019
    FBnil likes this.
  2. FBnil

    FBnil Ready to Champion the Pyra to the World...

    Joined:
    Dec 14, 2012
    Messages:
    2,695
    Location:
    Yurp
Well... Ubuntu's upstream, Debian, still has 32-bit. But 32-bit support has been dropping from many Linux distros lately, sadly.
The 64-bit version of Wine handles 32-bit Windows software just fine, so no problems from that angle.
Running 32-bit ELF programs takes a bit of work, but seems to work too:
    https://unix.stackexchange.com/ques...e-in-a-container-inside-a-modern-64-bit-distr
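For what it's worth, on a multiarch-capable Debian/Ubuntu install the usual dance looks roughly like this (a sketch; the libraries listed are only examples of what a given program might need):

Code:
# enable the i386 architecture next to amd64
sudo dpkg --add-architecture i386
sudo apt update
# then install whatever 32-bit libs the program actually needs, e.g.:
sudo apt install libc6:i386 libstdc++6:i386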

I'm curious... what is "stuff" to you? That 32-bit software to talk over serial to old routers? The 32-bit LapLink software? Proprietary blobs? A Pandora Work Environment?
     
    Last edited: Jun 18, 2019
  3. Klumpen

    Klumpen Run away! Run away!

    Joined:
    Jan 5, 2012
    Messages:
    6,606
    Location:
    Uncanny Valley
Every piece of software that needs the i386 libs, like wine32.
It's not about a 32-bit environment, but about those libs soon no longer working on my 64-bit system.
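(To see how much already depends on them, listing the installed i386 packages gives a rough picture; just a sketch, and the output obviously differs per system:)

Code:
# every 32-bit package currently installed alongside the 64-bit system
dpkg -l | grep ':i386'
# and what wine32 itself would pull in
apt-cache depends wine32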
     
  4. levi

    levi Still fresh, damnit!

    Joined:
    Oct 6, 2008
    Messages:
    10,970
    Location:
    Somewhere off the coast of the EU
Interesting. I've never heard of wine32 before, although I don't use Wine here because I've been using Linux for so long now that all of my go-to applications run natively.

The only reason I've come across before to have 32-bit libs installed is for cross-compiling Linux 32-bit applications. Archlinux dropped support for 32-bit code in their distributions two years ago (and that was i686 aka Pentium Pro or thereabouts; they've never supported the older i386 processors). That inspired the Archlinux32 project, which reinstated i686 and more recently has added the older i486 and the newer pentium4 architectures.

It may be worth noting that archlinux (the 64-bit-only project) still supports multilib, so that you can cross-compile things for native 32-bit architectures more sensibly. I'm not sure where the Ubuntu project stands on this front though.
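As a rough sketch of what multilib buys you (assuming the 32-bit toolchain bits, e.g. gcc-multilib on Debian/Ubuntu, are installed):

Code:
# build the same source for both word sizes on one 64-bit machine
gcc -o hello64 hello.c
gcc -m32 -o hello32 hello.c
file hello32    # should report "ELF 32-bit LSB executable, Intel 80386, ..."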
     
  5. elvissteinjr

    elvissteinjr Very Active Member

    Joined:
    Aug 23, 2013
    Messages:
    410
    Location:
    Germany
    I wonder how many games on Steam only ship with 32-bit Linux binaries... A couple of titles in my Humble Bundle have no indication of architecture on their Linux downloads so they may not be 64-bit.
    Though perhaps that's not much of an issue with Steam Runtime stuff? Not sure.
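A quick way to check a downloaded game binary, by the way (the file names here are just examples):

Code:
# 'file' tells you whether an ELF binary is 32-bit or 64-bit
file ./SomeGame.x86       # "ELF 32-bit LSB executable, Intel 80386, ..."
file ./SomeGame.x86_64    # "ELF 64-bit LSB executable, x86-64, ..."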

    I honestly don't think this stuff should be dropped. It doesn't really matter that much for the open source community, but still.
     
  6. Elw3

    Elw3 ƐʍlƎ

    Joined:
    Jul 21, 2013
    Messages:
    938
As stated in the post, it's not a problem at all to run 32-bit software on 64-bit systems.
The problem is more that certain libs won't compile to 32-bit anymore, since programmers get lazy and ditch them, or they just stop checking them for bugs. Darktable, for example, was what caused me to upgrade to 64-bit, since the 32-bit build was unstable as hell.
There is not much a distro can do about it, since most lack the manpower to check every compiled piece of code for stability. And by now we've reached the point where 64-bit is checked by default and 32-bit isn't anymore.
Ubuntu, on the other hand, ditched 32-bit computers years ago, in 2012 or so. So what's the fuss about now? Their 32-bit discs required a 64-bit-compatible CPU to run, so I've wondered for some time why they still existed.
My guess is that some spin-off, like Lubuntu, keeps supplying us with fresh 32-bit builds anyway; they did it all this time with the kernel, now they just add some libs too, so whatever. *shrug*
     
  7. ible

    ible professional vim user

    Joined:
    Mar 24, 2014
    Messages:
    2,155
    Location:
    Seattle, WA
I'm glad they're making it easier on themselves to support future 512-bit instruction sets.
     
  8. JDTAY

    JDTAY Half Pepperoni, All Cheese

    Joined:
    Sep 15, 2015
    Messages:
    653
    Location:
    North Carolina, USA
I'm a novice, so I was wondering: what's the point of having 64 bits' worth of instructions anyway? I get that reading and writing 64 bits at once is great, but for the actual instruction set, isn't 32 bits' worth of instructions, approximately four billion of the suckers, more than enough for humans? Who needs four billion squared?

    Meh, there's probably something I'm missing here. Maybe I'll google it.

    Addendum: Even 8 bit processors are Turing complete. Get a fast enough 8 bit processor and you could literally port Witcher 3 to it. I know having more than 8 bits makes things easier, but at what point does having more bits start to, you know, just complicate things? Judging by how poorly optimized most stuff is these days, I'd say we may have already hit that point.
     
    Last edited: Jun 19, 2019
  9. ptitSeb

    ptitSeb Serial Porter

    Joined:
    Aug 15, 2012
    Messages:
    8,359
    Location:
    France, near Lyon
That sounds bad :(

Dropping legacy compatibility has always been a bad thing, and it always hurts the end user.

EDIT:
Unless
means there is some way to run 32-bit apps without 32-bit libs... or with a minimal set of 32-bit libs, maybe?
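(One way to see how minimal that set could be is to ask the binary itself; the path here is only an example:)

Code:
# list the shared libraries a 32-bit binary actually needs
ldd ./some32bitapp
# or inspect it without executing any of its code
objdump -p ./some32bitapp | grep NEEDED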
     
  10. AnimatedFreak

    AnimatedFreak Very Active Member

    Joined:
    Oct 29, 2011
    Messages:
    300
Doesn't Steam rely on 32-bit libs? I remember having to install the 32-bit version of my graphics driver to get some games working with Proton. I've also had problems getting Steam to launch without some 32-bit libs on Debian.
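For the open-source Mesa drivers on Debian/Ubuntu that usually comes down to something like this (package names vary per release and driver, so treat this as a sketch):

Code:
sudo dpkg --add-architecture i386
sudo apt update
# 32-bit GL and Vulkan userspace for Mesa
sudo apt install libgl1:i386 libgl1-mesa-dri:i386 mesa-vulkan-drivers:i386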
     
  11. TrashyMG

    TrashyMG Sarcasm Dispenser Staff Member

    Joined:
    Jan 18, 2010
    Messages:
    10,492
Yes it does, and some games require the 32-bit libraries from the graphics drivers.
     
    AnimatedFreak likes this.
  12. levi

    levi Still fresh, damnit!

    Joined:
    Oct 6, 2008
    Messages:
    10,970
    Location:
    Somewhere off the coast of the EU
The main improvement, for me at least, is register width. 8-bit processors could only calculate with 8-bit registers, a native range of 256 values, which at the time tended to match the vertical resolution of the highest-resolution screens, so it wasn't too bad there at least; but in square-pixel modes the horizontal resolution was larger, which I guess is one of the reasons so few games used the square-pixel mode. You could do addition and subtraction on two 8-bit values easily, but doing any kind of multiplication or division wasn't easy. Once we got native 16-bit and 32-bit processors, you even had the range to do some simple 3D maths without having to resort to using multiple registers.

    On second thoughts: 64-bits is starting to get diminishing returns for my personal uses, although these days I guess video codecs and things probably use larger factorials and stuff.
     
    JDTAY likes this.
  13. FBnil

    FBnil Ready to Champion the Pyra to the World...

    Joined:
    Dec 14, 2012
    Messages:
    2,695
    Location:
    Yurp
• More available machine code instructions (CISC and even RISC)
• One machine code instruction can handle 64 bits at once (manipulating big numbers is not slow); see @levi 's post.
• More addressable registers available!
• More addressable RAM!
• More than 16/256 colours on your screen!

At first, 64-bit instructions ran slower than 32-bit instructions (due to the fact that you need double the bus bandwidth, and that memory had to be written twice and read twice (1)); the binaries were (and still are) almost twice as big, because instructions and reserved memory were larger.
Then systems got optimized for 64 bits. You can now address more than 4 GB of RAM, and you can have larger hard disks without having to partition them into many small drives. You can have more machine code instructions! And thus compilers got better at optimizing.

Could we still be in 32-bit land? Or even in 8-bit land? Maybe, but not for businesses. They have the money and push for bigger hardware.

What's the maximum memory that a 32-bit machine can address?
It is 2^32 bytes = 4 GiB. Try holding a modern large database in that.
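(A trivial check from a 64-bit shell, just to illustrate the numbers:)

Code:
getconf LONG_BIT              # 32 or 64: the userland word size
echo $(( 2**32 ))             # 4294967296 bytes addressable with 32-bit pointers
echo $(( 2**32 / 1024**3 ))   # = 4 (GiB)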

We have bank switching (which allowed 8-bit machines like the Commodore 128 and Spectrum 128 to address more RAM than they otherwise could); it's called EMS/XMS and PAE in x86 land.
We have LVM, where you can string together sets of disks to make a large one, and still access them all as one.
The instruction set of the 8-bit Commodore 64's CPU was tiny (no multiply or divide instructions, for example), but it had a whole lot of "unused" (i.e. undocumented) opcodes which, rather than going to waste, effectively ran two instructions at once by using different parts of the CPU simultaneously.
On 8-bit machines, you can string bytes together, a double byte or a quad (4 bytes), to represent a bigger number or get more floating-point precision. But you can only present them to the CPU one byte at a time, so it takes several instructions to, say, add 1 to a quad. On a 32-bit CPU, a 4-byte quantity is native and can be handled in one instruction. Much faster!

With 32 bits the x86 could have double the number of machine code instructions of a 16-bit machine, and so Intel thought "hey, let's use unused CPU opcodes" and made the Pentium MMX (multimedia extensions). Then AMD came with their own set, etc.


(1) The speed issue was solved by requiring an even number of RAM modules: dual-channel DIMMs instead of SIMMs.

    uh.. sorry my text is so messy...
     
    Last edited: Jun 20, 2019
  14. PCXT

    PCXT Very Active Member

    Joined:
    Sep 14, 2016
    Messages:
    221
I wonder what will happen to WINE... In Debian it's not only 64-bit; it needs a significant i386 environment to run 32-bit Windows binaries.
The transition was quite obvious for requirements reasons: today most so-called programmers do not know how their computers work, and they write CLI e-mail clients using Electron, loading gigabytes of Google's spyware into the user's computer and making even the best machine effectively single-tasking. You just cannot do efficient computing in 4 GB of RAM with modern software. If someone writes a "modern" program, it doesn't matter that the user may need memory for other tasks; it behaves as if it were the only program.
BTW, the typical beginner's question about switching to Linux used to be "isn't it too early?" (e.g. for useful productivity software); I think more and more that it's too late. It's becoming a toy for enterprise users with access to multi-processor servers and terabytes of RAM, while useful productivity software gets abandoned.
     
    Klumpen likes this.
  15. TrashyMG

    TrashyMG Sarcasm Dispenser Staff Member

    Joined:
    Jan 18, 2010
    Messages:
    10,492
Well, who uses Ubuntu anyway...
     
  16. FBnil

    FBnil Ready to Champion the Pyra to the World...

    Joined:
    Dec 14, 2012
    Messages:
    2,695
    Location:
    Yurp
At work, I have a Linux desktop to work with. Linux, Mac and Windows desktops co-exist. For servers, Linux runs on Power (the hardware AIX normally runs on) and on x64, and Ansible is being extended to Windows for software distribution. So don't worry about 64-bit Linux... it's not a toy. Microsoft knows this (hence the attempts to Embrace, Extend, Extinguish it). Azure is basically a stripped-down Linux running Windows VMs on top...

As for a good solution? By the time you really need it, ReactOS will be ready to sweep up all users who seek 32-bit Windows-compatible environments.
Meanwhile, don't "apt-get dist-upgrade" your Ubuntu... (apt-get update/upgrade is fine, though).
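If you want to make sure a box stays on 18.04 for now, something like this should do it (a sketch; double-check the file on your own install):

Code:
# tell update-manager never to offer a release upgrade
sudo sed -i 's/^Prompt=.*/Prompt=never/' /etc/update-manager/release-upgrades
# normal package and security updates remain fine
sudo apt-get update && sudo apt-get upgrade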

All things get old, similar to the BIOS-to-UEFI transition or the lack of COM1 and LPT1 ports in modern hardware. But solutions will exist, like ISA and LPT support on modern hardware through USB.
     
    Last edited: Jun 20, 2019
  17. AnimatedFreak

    AnimatedFreak Very Active Member

    Joined:
    Oct 29, 2011
    Messages:
    300
    Well that's gonna screw me over :(
    I might move over to Debian then
     
  18. gunrock

    gunrock Member

    Joined:
    Jan 20, 2011
    Messages:
    498
As @Elw3 said earlier, they started this rubbish years ago, when they stopped building kernels that run on i586 (like my VIA C3). I had MythTV running on that, and it massively screwed me over. Even more annoying: the CMOV instruction they needed, and said i586 didn't have, does exist in the revision of the VIA C3 that I had, but the kernel installer didn't bother to check... :(
Anyway, that's what is surprising about this news, as actual i386 hasn't been supported for years. I guess they mean they will only support x64 and not x86 (i686 in Linux arch terms) anymore.
     
    Last edited: Jun 20, 2019
  19. FBnil

    FBnil Ready to Champion the Pyra to the World...

    Joined:
    Dec 14, 2012
    Messages:
    2,695
    Location:
    Yurp
    Ok, we need to clear up some things:

    If you have the following Microprocessor in your Desktop/Laptop:
    • AMD K5, K6, K6-2 (aka K6 3D), K6-3
    • DM&P/SiS Vortex86, Vortex86SX
    • Cyrix III, MediaGX, MediaGXm
    • IDT Winchip C6, Winchip 2
    • Intel Pentium, Pentium with MMX
    • Rise mP6
    • VIA C3 ‘Samuel 2’, C3 ‘Ezra’ <-- sorry @gunrock
Then most major Linux vendors will drop support for it. (If you had a 286... it never ran Linux, as Linux needs the 32-bit protected mode that arrived with the 386. It probably ran CP/M, which looked similar.)

Now, the 386 is a 32-bit chip, but the 486 is not a 48-bit chip and the 586 is not a 58-bit chip...

    For Debian:
    The users using the older i486 or i586 architectures are advised to move to Debian 8 “Jessie”, which should support the older processors until 2020.


Now, if you have a 64-bit machine, you do not fall into that category. I repeat: you do not need to worry.

You are able to install DOSBox on 64-bit Linux and run 16-bit programs... and you do NOT require "16-bit libraries" (at least not in Linux; the game you run inside might).
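(For example, a 64-bit DOSBox emulates the whole 16-bit world in user space; GAME.EXE is just a placeholder here:)

Code:
sudo apt install dosbox   # the 64-bit build is all you need
dosbox GAME.EXE           # the 16-bit DOS binary runs inside the emulator, no 16-bit host libs involved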

As for Wine, Wine has its own "Windows on Windows" (which reads: 32-bit Windows running on 64-bit Windows).
Those 32-bit Windows libraries are part of Wine and are installed with Wine. See: https://wiki.winehq.org/Building_Wine#Shared_WoW64
https://linuxhint.com/run_32_bit_windows_64_bit_unbuntu/
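The shared WoW64 build from that wiki page boils down to roughly this two-tree build (a sketch; "wine" is your Wine source checkout, and the wiki remains the authoritative reference):

Code:
mkdir wine64 wine32
# 64-bit tree first
cd wine64
../wine/configure --enable-win64
make
# then the 32-bit tree, pointing at the 64-bit one
cd ../wine32
../wine/configure --with-wine64=../wine64
make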

Same for Windows 10. Windows 10 is 64-bit, but through a compatibility layer it is able to run 32-bit software... and this compatibility layer has been mimicked in Wine with WoW64.

    In Ubuntu you might need to access the universe repository to get it, but it will be there.

Now, the reason for this is unstable 32-bit builds with the new GCC compiler, as it targets 64 bits now.

    Q. Doesn’t Steam use 32 bit libraries? How can I play my games?

    Steam itself bundles a runtime containing necessary 32-bit libraries required to run the Steam client. In addition each game installed via Steam may ship 32-bit libraries they require. We’re in discussions with Valve about the best way to provide support from 19.10 onwards.

    It may be possible to run 32 bit only games inside a lxd container running a 32 bit version of 18.04 LTS. You can pass through the graphics card to the container and run your games from that 32bit environment.

    source: https://discourse.ubuntu.com/t/i386...opped-starting-with-eoan-ubuntu-19-10/11263/2
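A hedged sketch of what that container approach could look like with LXD (the exact image alias is an assumption on my part, so check what the public image server actually offers first):

Code:
# see which i386 Ubuntu images the images: remote provides
lxc image list images: ubuntu i386
# launch one (alias may differ) and get a shell in the 32-bit environment
lxc launch images:ubuntu/18.04/i386 bionic32
lxc exec bionic32 -- bash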
     
    Last edited: Jun 20, 2019
  20. levi

    levi Still fresh, damnit!

    Joined:
    Oct 6, 2008
    Messages:
    10,970
    Location:
    Somewhere off the coast of the EU
You're talking here about old Windows and DOS software. Others are talking about Linux games they received that were only shipped in 32-bit form.

I'm running a full suite of 32-bit software, mostly built using GCC, on this machine at present, and it's all still working for me. There was a bit of difficulty building many things on actual 32-bit machines to begin with, as more software seems to want to use massive linker tables or something like that, ISTR. I'm under the impression they actually run a 32-bit GCC on a 64-bit kernel in a 32-bit chroot now, and that seems to work better.
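For reference, that kind of 32-bit-userland-on-64-bit-kernel setup can be reproduced with debootstrap (a sketch; the suite and mirror are just examples):

Code:
# create a minimal 32-bit Debian userland under /srv/chroot32
sudo debootstrap --arch=i386 buster /srv/chroot32 http://deb.debian.org/debian
# enter it with a 32-bit personality so uname -m reports i686
sudo linux32 chroot /srv/chroot32 /bin/bash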
     
