initramfs 2 days ago

https://j-core.org/

"What is this processor? The SuperH processor is a Japanese design developed by Hitachi in the late 1990's. As a second generation hybrid RISC design it was easier for compilers to generate good code for than earlier RISC chips, and it recaptured much of the code density of earlier CISC designs by using fixed length 16 bit instructions (with 32 bit register size and address space), using microcoding to allow some instructions to perform multiple clock cycles of work. (Earlier pure risc designs used one instruction per clock cycle even when that served no purpose but to make the code bigger and exhaust the encoding space.)

Hitachi developed 4 generations of SuperH. SH2 made it to the United States in the Sega Saturn game console, and SH4 powered the Sega Dreamcast. They were also widely used in areas outside the US consumer market, such as the Japanese automotive industry.

But during the height of SuperH's development, the 1997 Asian economic crisis caused Hitachi to tighten its belt, eventually partnering with Mitsubishi to spin off its microprocessor division into a new company called "Renesas". This new company did not inherit the Hitachi engineers who had designed SuperH, and Renesas' own attempts at further development on SuperH didn't even interest enough customers for the result to go into production. Eventually Renesas moved on to new designs it had developed entirely in-house, and SuperH receded in importance to them... until the patents expired."

  • mikepurvis 2 days ago

    Interesting point of history— the H8 processor is the MCU that powers the original Lego Mindstorms RCX. In high school I wrote some assembly language for it when making a robot that ran on BrickOS:

    https://en.m.wikipedia.org/wiki/BrickOS

    • kn100 2 days ago

      A slightly different (but close enough) Hitachi CPU also powered the Cybiko, a wacky games console few have heard of. I tried writing a disassembler for that particular CPU a while back. It was an interesting platform, and oh man, the documentation (at least to me) was gorgeous: github.com/kn100/cybemu/

      • dagmx 2 days ago

        Link to the Cybiko: https://en.wikipedia.org/wiki/Cybiko

        That is a really wild design

        • phs318u 2 days ago

          Love it!

          “The concept for the device emerged from social research conducted in six countries, which identified a need for digital communication among youth. … a radio protocol was patented. This protocol allowed up to 3,000 Cybiko devices to form a network without using auxiliary stations.”

          Fascinating!

          • cpach 2 days ago

            Wow. That’s basically a smartphone!? Just way ahead of its time.

            • xattt 2 days ago

              It only needed a school population geeky enough to have Cybikos. I wanted one, but no one else in middle school had one!

            • hfgjbcgjbvg 2 days ago

              Apple might as well do this and cut out the carriers.

              • xattt 2 days ago

                Is this what the Thread radios in all Apple devices are hiding?

                • teruakohatu 2 days ago

                  Sadly, Thread is 2.4 GHz rather than the much lower and longer-range LoRa frequencies. It's designed to be used within the home and between homes.

              • asveikau 2 days ago

                Sounds a bit like LoRa.

                • ruined 2 days ago

                  It's the same 900 MHz band.

            • Fnoord 2 days ago

              I mean, it was not a smartphone but a PDA, and it ran Linux: the Sharp Zaurus. I owned a CL1000 (later on I went to the Nokia N810, though in hindsight the N800 was just as cool with dual SD). Magnificent devices, and back then I liked that they didn't have GPRS or 3G. Those connections were slow (and expensive) anyway, so I felt that saved me money. Besides, the WWW was dominated by MSIE. And no GSM meant less tracking for me.

            • samatman 2 days ago

              More interesting than a smartphone in some ways.

              Smartphones can form mesh networks with WiFi, but neither of the duopoly OSes has this as a built-in feature; it's left to apps, and that fragments the potential. The most powerful radio isn't user-controllable: it's strictly pay-to-play, operating a base station is heavily licensed, and no peer-to-peer activity is possible.

              This is something I'd like to see disrupted, although I'm not holding my breath. I don't like that grid failure or (more likely) government order can knock smartphones off the network as easily as they can.

              • Fnoord 2 days ago

                Back in the day (very early 2000s), Amsterdam got a WLAN network called Wireless Oost. This gave people in Amsterdam Oost wireless internet access throughout the neighborhood, without needing cable internet from A2000 (later merged with Chello, who merged with UPC, who merged with Ziggo, all the while owned by a certain company known as Liberty Global (you might have heard of them)). It also served as a (W)LAN. Back then, the local computer club ASCII (made up of political activists and such, a predecessor of the hackerspaces) had a cargo bike called Bakscii (derived from the Dutch word for cargo bike (bakfiets) and ASCII). This provided WLAN access at demonstrations and such. It was a neat project which the Internet Archive likely still holds.

                I was never directly involved with said project, btw, but I knew all of the people who were. Nowadays, I live near Amsterdam and will get 1 Gbit symmetric fiber internet soon (most of NL already has access to it). I use a WireGuard tunnel to connect from any network (LTE/NG/WLAN) through my home network, which runs Pi-hole. My point being: security has improved, but there is a constant: wireless internet usage can be triangulated and eavesdropped, and clients can be tracked. It is something we need to live with. Every time I leave my smartphone at home, even for something as small as picking up my kids at school or sports, I feel good. However, I can easily be tracked by all the doorbell cameras in the street.

                We are done with subversion on the internet. It is over, a done deal. I gave up many years ago, and eventually I also embraced the thought. If you want some private time together, go to some cave in Turkey or whatever.

    • immibis 2 days ago

      My CASIO graphics calculator from high school also uses a SuperH.

      • voidbert 2 days ago

        And interestingly for a graphing calculator, it lacks an FPU, and all floating point math has to be done in software.

        • epcoa 2 days ago

          Not particularly unique at all. Handheld graphing calculators typically were not intended for use where hardware floating point was necessary. TI calculators historically used the Z80, no FPU. Earlier HP calculators did use a custom BCD-based (not IEEE 754) floating point ISA, but these are still slower than just about any processor of the past 30 years doing software floating point. For one thing, they didn't have any hardware multiply or divide; this is not an FPU in the modern sense people envision. And later on the HPs used common ARMv5-based processors with emulation. None of these ARM processors had hardware floating point. Same with later TI calculators that were 68k- and then ARM-based.

          The HP Prime G2, released in 2018, is about the only mainstream device that happens to have hardware FP, and that's for a device more touted for CAS features. The FPU is more just something that comes for free with the commodity SoC chosen.

          • rbanffy 2 days ago

            > None of these ARM processors had hardware floating point

            More importantly, an IEEE 754 floating point ISA is not great for calculators: they don't require the speed a personal computer does, and calculators work hard to hide the fact that numbers such as 0.45 can't be represented exactly as binary IEEE 754 floats.
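
            A minimal Python sketch of that 0.45 point, just to make the representation issue concrete (the Decimal comparison is only illustrative; real calculator firmware has its own BCD/decimal formats):

            ```python
            from decimal import Decimal

            # The closest IEEE 754 double to 0.45 is not actually 0.45:
            print(Decimal(0.45))   # 0.450000000000000011102230246251565404236316680908203125

            # The classic visible symptom of binary floating point:
            print(0.1 + 0.2)       # 0.30000000000000004

            # Decimal floating point, which many calculators use internally,
            # represents 0.45 exactly:
            print(Decimal("0.45") * 2)   # 0.90
            ```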

        • immibis 2 days ago

          It uses decimal floating point, so that's no worry.

    • chaosite 2 days ago

      In popular culture, the Hitachi H8 microprocessor was referenced in the song Space Dementia by Muse.

      > Q - "What does "H Eight" mean?"

      > Matt [Bellamy]: Using a microcomputer (Hitachi H8 / 3048F) which can be built into the industrial machines, you can learn and understand the inputs/outputs of the microcomputer as a basis of robot control and conduct the experiments by C-language for stepping motor control, servomotor control (PWM control) and serial communication. H8 model, a 16-bit microcomputer consists of 32-bit registers, has a flash ROM of 128KB, a RAM of 4KB (SRAM) with external extension of 128KB and 78 I/O terminals with the built-in A/D and D/A converters. H8 is a microcomputer usually built into a TV, VTR, mobile-phone and car navigator. Since it has ample I/O terminals, H8 microcomputer is also used as a brain of a small robot.

      [0] - https://web.archive.org/web/20160406073458/https://www.micro...

    • Tor3 a day ago

      There was a SuperH in my parents' Panasonic TV - I found the GPL license and kernel info deep in the menu system.

  • sspiff 2 days ago

    I had a SuperH 3-powered HP Jornada that I ran Linux and NetBSD on back in the day. Not particularly fast, but its power efficiency was off the charts, even compared to the many contemporary ARM- and MIPS-based devices.

    Or at least that's how my nostalgic memories think of it.

    It was really fun to have a pocketable laptop back in those days for me (baggy pants required). Good times.

  • phire a day ago

    > Hitachi developed 4 generations of SuperH.

    Technically it's only three generations, as the SH-1 and SH-2 were launched at the same time. The SH-1 was a stripped down basic model, missing a few instructions like 32-bit multiplication, while the SH-2 had everything.

    The Sega Saturn had two SH-2s for the main CPU, and one SH-1 as a CD-ROM controller. It also had a 68EC000 for the sound subsystem... and a DSP... Too many CPUs.

  • hulitu 2 days ago

    > spin off its microprocessor division into a new company called "Renesas". This new company did not inherit the Hitachi engineers who had designed SuperH

    This explains why Renesas' products are so bad and why the datasheets are terrible.

    • joezydeco 2 days ago

      I tried to evaluate their intro Linux processor (RZ/A) a while back and the amount of support we got from the factory was dismal. I got an FAE to confess that the two factions (Hitachi vs NEC) didn't get along. They have a bread-and-butter product line in the RL78 but we dropped the idea of using them for anything else.

  • farmdve 2 days ago

    I have done extensive reverse engineering of SH2 firmwares. At first I didn't like it, but then I found it elegant.

  • asveikau 2 days ago

    > developed by Hitachi in the late 1990's.

    I remember hearing about SuperH in the early 90s. Wikipedia says 1992.

    > in the Sega Saturn game console,

    Wasn't that around 1995? Already too early for "late" 1990s.

    • giantrobot 2 days ago

      Saturn development started around 1992, and it was released in Japan in 1994.

  • DaoVeles a day ago

    I only ever had a very brief hands on time with SuperH based systems (excluding the excessive amount of Saturn/Dreamcast play time). They seemed to have efficiency as their very core goal. So while they were not the fastest processors around, they squeezed out every little drop from what they had to work with.

  • publicmail 2 days ago

    The ECU in my 350z also used a SuperH CPU - I think SH2?

PhasmaFelis 2 days ago

I was gonna say, the same thing that happened to all the western PC platforms that weren't Microsoft or Apple. Commodore and Atari and Acorn and Sinclair and Dragon and probably dozens of others I've never heard of. As computers became more powerful and development costs rose, small-market architectures and OSes simply became unsustainable. You had to either reach sustained global success or die.

I'm sure there were some unique challenges for architectures that mainly served Japan, but I doubt they were that much worse than the ones facing the ones that mainly served, say, Britain. All of them lost the race in the end.

The same thing happened again with graphical cellphones! In the flip-phone era there were a zillion different OSes with their own app libraries. For a while it looked like Blackberry was set to be the Microsoft of the upcoming smartphone era, and then Apple stole their thunder, and no one could compete except Android and Windows Phone, and then Windows Phone dropped out too, and now we're back to two basic architectures with no meaningful competitors, just like the home PC market by 1996.

> By 1994 though, they had a problem: the 32-bit consoles were out, which could do 2D games just as well as the FM Towns and X68000, and the consoles could also do 3D that blew away anything those computers could handle.

This line from the article caught my eye in particular, because it's similar to what happened to Commodore's Amiga, one of the last real Microsoft competitors in the West. Essentially, Doom killed it. There's a rather tragic list of Amiga games that struggled valiantly to be Doom on that platform, and some of them were pretty good, but none of them could really match what id could do with a tricked-out DOS machine in 1993, and that was more or less that.

  • TacticalCoder 2 days ago

    > Commodore and Atari and Acorn and Sinclair and Dragon and probably dozens of others I've never heard of.

    Let's not forget the French. They had Oric, Matra (IIRC), Thomson and... the Minitel.

    And now they're absolutely nowhere to be seen, just like Japan: it's either PCs (basically running Windows or a Un*x) or Macs.

bane 2 days ago

I think this somewhat misses an important nuance. Japanese PCs had to be different early on because of the complexities of the written language. In Western markets, all of the important characters could be handled in just a few bits (7 or 8) and at low resolution, with different fonts and character maps dropped in to support a few different alphabets.

But in CJK countries, things were much harder, and the entire I/O system had to be significantly more capable than what might pass for usable elsewhere. This meant larger ROMs, larger framebuffers, higher resolution displays, more complex keyboarding systems, the works. Everything was harder and more expensive for a long time. A common add-on was a ROM with Kanji (Chinese-derived characters) support, in the same way a person in the West might buy a new sound card or get a VGA card. Except this was just so you could use your new $1200 computer (in today's money) to write things on.

Back then, given limited memory, you also ended up with a ton of different display modes that offered different tradeoffs between color, resolution, and refresh. Because of the complex character sets, these Japanese systems tended to focus on fewer colors and higher resolution, while the West focused on more colors at a lower resolution in the same or less memory space (any fans of mode 13h?). The first PC-98 (the 9801) shipped in 1982 with 128K of RAM and a 640x400 display with special display hardware. The equivalent IBM PC shipped with 16KB of RAM and CGA graphics, which could give you a display no higher than 640x200 with 1-bit color but was mostly used in 320x200 with 4 (terrible) colors.
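
Rough framebuffer arithmetic behind that comparison (a back-of-the-envelope sketch in Python; the 3-bitplane figure for the PC-98's 8-color mode is my assumption):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Raw bitmap size for a simple packed or planar framebuffer."""
    return width * height * bits_per_pixel // 8

# PC-9801 (1982): 640x400 in 8 colors -- assuming 3 bitplanes, i.e. 3 bpp
print(framebuffer_bytes(640, 400, 3))   # 96000 bytes, before any kanji font ROM

# IBM CGA: 320x200 in 4 colors (2 bpp) or 640x200 in 2 colors (1 bpp)
print(framebuffer_bytes(320, 200, 2))   # 16000 bytes
print(framebuffer_bytes(640, 200, 1))   # 16000 bytes
```

Roughly a sixfold gap in display memory alone, which is part of why the Japanese machines needed dedicated video hardware and larger ROMs from the start.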

Even with similar base architectures, these formative differences meant that lots of the guts of the systems were laid out differently to accommodate this -- especially the memory maps.

By the time "conventional" PCs were able to handle the character display needs (sometime in the mid-90s), they were selling in the millions of units per anum which drove down their per unit prices.

The Japanese market was severely fractured and had a smaller addressable market. Per-unit costs were higher, but the software was largely the same. Porting the same businessware to half a dozen platforms cost too much. So the average user of the Japanese systems had a smaller library of software, which was more or less a copy of what was on IBM PCs, on more expensive hardware -- market forces solved the rest.

(btw, the FM Towns, IIRC, also had specialized graphics hardware to produce arcade-like graphics with tiles and sprites and so on, making it even more different)

Some of this history also informs why home computing lagged in Japan compared to the West despite having all of the other prerequisites for it to take off.

graphics

https://www.pc98.org/

memory maps

https://radioc.web.fc2.com/column/pc98bas/pc98memmap_en.htm

https://wiki.osdev.org/Memory_Map_(x86)

  • tkgally 2 days ago

    Excellent summary. A few additional comments from personal memory:

    I have lived in Japan since 1983, and I started working as a freelance Japanese-to-English translator in 1986. I wanted to produce clean-looking text in English for my clients, so after a few months using a manual typewriter I took out a loan and bought a Macintosh with a dot-matrix printer. If I remember correctly, it cost six hundred thousand yen. The Mac could not handle Japanese; when I needed to write Japanese text, such as for notes to clients, I wrote by hand. I eventually bought a dedicated Japanese word processor for writing clean text in Japanese.

    Around 1992, I bought a modem and went online, first to a local foreign-run BBS and then, a couple of years later, the Internet. Many of the first friends I made online were Japanese-English translators like myself, and some of the most active discussion groups I took part in were about the Japanese language and translation.

    The display of Japanese characters in our online discussions was a problem for a long time. Even as more and more of the participants became able to type Japanese on their own computers, they were using a variety of OSs and character encodings, and the Japanese parts of their messages, when posted online, would be corrupted more often than not. When discussing a particular Japanese expression, we would have to romanize the Japanese and, sometimes, explain what kanji were used.

    Here are two examples from posts to a translators’ mailing list in 1998:

    > While this handbook uses "åòå≈ê´" for "robustness", the systems engineers I work with prefer "ÉçÉoÉXÉgê´" <robasutosei>.

    > Ruth, the kanji for taikou are tai (as in taishi - Crown Prince) and kou (as in kugurido - the radical is mon with gou inside (gou = au/awasersu). Does this help? The dictionary meaning obviously does not make sense here.

    This made it impractical to discuss longer texts or to have our discussions in both English and Japanese.

    It was a great relief when, around 2000 or so, the encoding issues were gradually resolved and we became able to write Japanese freely in our online discussions.

    (Addendum: I am still in touch with some of the people on that mailing list, including the Ruth mentioned above. In fact, last month I attended a party in Yokohama in honor of her and her husband’s 55th wedding anniversary. Several other friends I first met online in the mid-1990s were there, too.)

    • bane 2 days ago

      Oh wow, that's a great personal story.

      I would imagine things began to improve around 2000 due to the broad adoption of Unicode? I remember there being an absolutely huge number of encoding systems for the various CJK languages back then, but I think Windows eventually guessed/settled on UTF-16 IIRC.

      I didn't live in Asia during this time, but was heavily involved in writing some multilingual capable desktop windows software and was very aware of these challenges. I remember one colleague who worked on our Chinese language material having to buy an expensive copy of a Chinese British telegraph code book.

      • tkgally 2 days ago

        > I would imagine things begin to improve around 2000 due to the broad adoption of unicode?

        Yes. I don't remember the exact timing, but it was the switchover to Unicode by Windows and Mac that finally resolved the character encoding conflicts (mostly).

        I don't know how much attention this got outside Japan, but there was significant opposition here to Unicode for a while from some Japanese intellectuals. Handwritten and, sometimes, typeset Japanese has traditionally allowed for a lot of minor variations in the forms of kanji. A person might write their name with an extra stroke in one character, for example. Some of those variants are ignored in Unicode as well as in the Japanese encodings on which it is based, and some people kicked up a fuss about it in the Japanese press. I remember reading rants accusing Bill Gates of trying to suppress Japanese culture by imposing a homogenized character encoding through Windows.

        The controversy has long since died down, though there have been efforts to develop more comprehensive character sets of Japanese and other Asian languages, such as TRON Code and Mojikyō:

        https://en.m.wikipedia.org/wiki/TRON_(encoding)

        https://en.m.wikipedia.org/wiki/Mojiky%C5%8D

        • samatman 2 days ago

          Oh it gets attention on Hacker News to this day. Any time there's a thread about something Unicode one or another of our Japanese colleagues will show up to make a fuss about how Unicode is quite insufficient for Japanese, for this reason or that.

          In 2024 I'm out of sympathy for this. There's an entire tertiary plane for extended Hanzi/Kanji/ideographs, if there are still missing glyphs please take it up with the Unicode Consortium, because there's plenty of room and it is reasonable to support one (1) character encoding now.

          But it is in fact Microsoft's fault that the farcical attempt to squeeze "CJK" into a two-byte encoding was even attempted. They were firmly committed to the idea that UCS-2 could be viable, and we're stuck with various consequences of this fatuous premise to this day.

      • oefrha 2 days ago

        In fact, I occasionally run (currently maintained) CJK commercial software on my en-US Windows installation and I still run into ??? from time to time and have to guess what the text is supposed to be. It’s a shitshow.

        • pezezin 2 days ago

          Heck, the stupid Windows application for the "my number card" has this problem, and it is an official application coming from the government itself! How is it possible that in 2024 we still have this problem?

          • poincaredisk 2 days ago

            Between 2010 and 2020 I've ordered something from Amazon around 5 times, and each time they spelled my legal name wrong, each time in a different way (due to encoding issues). And I'm a European, living in a large country. So I'm not surprised CJK languages have this problem.

          • creakingstairs 2 days ago

            Yeah, and a part of e-tax literally asks you, in fine print, to have Japanese Windows, or it fails silently without any warning messages (fixed by changing the Chrome locale to Japanese, thankfully). Gotta love Japanese websites. I just go into the office and hand-write the forms because it beats trying to debug cryptic issues.

            • FMecha 2 days ago

              Many Japanese PC games and desktop applications aimed at the home market, from small to large developers, also do this sort of thing. Just look up the word "AppLocale".

        • Laforet 2 days ago

          There is a system-wide setting that changes all non-Unicode text encoding to another code page, e.g. CP932 for Shift-JIS. Third-party tools are available to do the same conversion on a per-application basis.

          It’s not as bad as trying to load some really old CJK web pages on mobile devices: few mobile browsers have an accessible option to select character encoding, and there appears to be none on iOS. The only option is to change the system language, and that didn’t always work for the more obscure character codes.

    • mappu 2 days ago

      What a wonderful story.

      I spent a while playing with `iconv` commands to solve your mojibake, reinterpreting bytes in and out of Shift-JIS, but I didn't get it. I'd love it if anyone managed to figure out the exact encoding.
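
      For anyone who wants to keep poking at it, here's the same round-trip idea sketched in Python rather than iconv (the candidate "wrong" codecs are guesses, not the known answer):

      ```python
      # The mojibake presumably arose from Shift-JIS bytes being decoded with the
      # wrong codec. Re-encode the garbled string under a guessed codec to recover
      # the raw bytes, then decode those bytes as Shift-JIS and see what falls out.
      mojibake = "åòå≈ê´"
      for guess in ("mac_roman", "cp437", "latin-1", "cp1252"):
          try:
              raw = mojibake.encode(guess)
              print(f"{guess:>10}: {raw.decode('shift_jis')}")
          except (UnicodeEncodeError, UnicodeDecodeError) as err:
              print(f"{guess:>10}: no luck ({err.reason})")
      ```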

  • TheAceOfHearts 2 days ago

    Going way off the beaten path, this post makes me realize how lucky the West was to be able to develop practical computers when they did. And it also has me speculating how challenging it could be for an alien civilization to develop modern computing, depending on the complexity of their interface with reality. English is surprisingly utilitarian and straightforward for a first pass at a computing device. But one could imagine some alien civilization with far more foreign concepts of communication struggling to develop practical input devices and displays during a primitive computing era.

    • Maken 2 days ago

      It's not the English language but the Latin alphabet that helped there. It had already been optimised in ancient times to a limited set of easily readable characters, and then was further standardised by the introduction of the printing press. Going from a collection of metal pieces to a collection of bitmaps is way easier than adapting a handwritten language (which already had problems adapting to the printing press).

      • bane 2 days ago

        To add to this, there are some specific characteristics or acceptable practices within European alphabets (by the time that computers were invented) that helped.

        - The total number of characters needing representation is small. In the most limited case, you can get by with just the majuscule (upper case) characters. This means you can represent all letters in just a few bits. Representing all upper + lower + numbers + basic punctuation for almost any single language in Europe, even non-Latinate languages, is possible in just 7 or 8 bits! (A quick byte-count sketch follows this list.)

        - The glyphs are linearized in a consistent direction, so data storage to display logic is very simple. This is opposed to languages like Korean, where characters are assembled into syllables and text can flow horizontally left to right or vertically in columns running right to left.

        - Most European languages have both print and cursive forms. By the 20th century it was acceptable to use the disconnected print form, which made display logic much simpler (just copy the 8x8 bitmap in ROM to the screen). Importantly, it was also acceptable to break apart special semi-ornamental forms like ligatures.

        - A more or less universal base-10 number system meant that we only needed to include 10 digits in the character set. Arithmetic only requires a handful of symbols. So you can include a pretty big subset of math in your character encoding. This is opposed to other systems that use distinct characters in ways similar to Roman numerals (e.g. Chinese numerals which also have financial and lay versions of counting).

        - Punctuation and simple arithmetic characters had been basically boiled down to about a dozen glyphs.

        - If you use 8 bits for your character encoding, you have so much space you can also include pretty much every character variant needed for all Latin-using languages, allowing you to hit an addressable market of billions of people with the same 16K ROM.
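
        A quick byte-count sketch of the first bullet above (plain Python; the ~6,879 figure is the approximate total of graphic characters defined by JIS X 0208, kanji plus kana and symbols):

        ```python
        import math
        import string

        # One European language: upper + lower + digits + basic punctuation + space
        western_glyphs = len(string.ascii_letters + string.digits + string.punctuation) + 1
        print(western_glyphs, "glyphs ->", math.ceil(math.log2(western_glyphs)), "bits")
        # 95 glyphs -> 7 bits, so one byte per character with room to spare

        # Japanese: JIS X 0208 alone defines roughly 6,879 graphic characters
        jis_glyphs = 6879
        print(jis_glyphs, "glyphs ->", math.ceil(math.log2(jis_glyphs)), "bits")
        # ~13 bits -> two bytes per character, plus a far larger font ROM to draw them
        ```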

      • smegger001 2 days ago

        >which already had problems adapting to the printing press

        As I recall, the printing press was developed in the East first. Each page was essentially a woodcut pressed onto the page; the major innovation Gutenberg had in the West was movable type, more than the press itself.

        • qwytw 2 days ago

          > was movable type

          And even then, the Chinese had already used movable type ~400 years before Gutenberg, possibly even for "serial numbers" on printed banknotes. The oldest surviving book printed with movable type was published in the late 1370s in Korea: https://en.wikipedia.org/wiki/Jikji.

          Even woodblock printing, while it was seemingly known in Europe or at least the Byzantine Empire since the 11th-13th centuries, only became heavily commercialized and widespread around the same time as Gutenberg's press, pretty much in parallel with it (for playing cards, illustrated books etc.).

          There was something special about Europe in the 1400s. I'm not downplaying his skills/ingenuity but Gutenberg's greatest achievement was doing what he did at the right time and place. It seems there wasn't enough demand for books/printed materials that would have justified the needed investments anywhere else.

          • wrp 2 days ago

            Another thing about the development of movable type. I've heard that Gutenberg's main contribution was the development of an alloy that made metal type feasible, and that the (earlier) Korean solution was to use ceramic type.

            • qwytw 2 days ago

              They also tried bronze and copper (mainly for money, which used a mix of plate and movable type).

              The transition from wood to bronze metal type might have been independent in Korea, though. They even had a pretty extensive printing industry as far as we can tell, but it was fully controlled by the state/emperor and non-official printing was illegal.

              I'm certainly not an expert but at least in China's case the type of inks they used didn't really work well with metal or even ceramic type which probably made the process relatively inefficient.

              But yeah, Gutenberg invented a new alloy, ink, and moulding method, which made the process much cheaper and more effective.

          • christkv 2 days ago

            I think it coincided with a growing demand of reading material as the merchant class expanded.

        • _DeadFred_ 2 days ago

          I think at this point we understand that 'inventing' or 'discovering' means coming up with the solution/discovery that took off in modern culture/usage in a way that shaped our current world.

  • FMecha 2 days ago

    >(btw, the FM Towns, IIR, also had specialized graphics hardware to produce arcade-like graphics with tiles and sprites and so on, making it even more different)

    I originally thought you might have confused it with the X68K, whose closeness to arcade hardware such as the Capcom CPS-1 allowed for nearly (but not fully, contrary to popular belief) perfect ports at the time. (Capcom even sold a Genesis/SNES controller converter for the X68K for the Street Fighter II ports.) However, the FM Towns does have its own share of arcade ports, mainly involving Sega, Taito and Capcom titles, such as After Burner, Operation Wolf and Super Street Fighter II, respectively.

    On the FM Towns side, its commercial games list appears to sit between the X68K's "arcade at home" experience and the PC-98's catalog of RPGs and visual novels (often smut in nature), genres which the PC-98's graphics capabilities favoured since they suited static images. (No PC-98 mention/discussion is complete if eroge is not brought up at one point anyway.)

    • bane 16 hours ago

      The X68k was an absolutely amazing line of machines. Beautiful industrial design in and out.

      The FM Towns relied on Intel CPUs, but the rest of the hardware was designed more or less like any other Japanese system of the era with specialized graphics and sound hardware. I believe it also was capable of booting off of CDs and running the software directly without having to install anything to an internal hard drive.

      The specifications of the graphics and sound hardware were bleeding edge for sprite-based hardware of the time and included some really incredible graphics modes that far exceeded normal VGA, plus some genuinely arcade-level sprite hardware, which IBM-PC graphics hardware never offered.

      The audio hardware was likewise well in excess of contemporary PC soundcards. It also mirrored arcade level hardware in capability.

      In some ways, the FM Towns was as much a general purpose Personal Arcade as it was a Computer.

      https://en.wikipedia.org/wiki/FM_Towns

      • FMecha 15 hours ago

        >The FM Towns relied on Intel CPUs, but the rest of the hardware was designed more or less like any other Japanese system of the era with specialized graphics and sound hardware.

        This was also true of the PC-98, to some extent.

  • charlieyu1 2 days ago

    Taiwan had similar problems for Traditional Chinese in the early days. The solutions were specific hardware cards in the beginning, then a software-only OS compatible with IBM PCs. There was no need to reinvent the wheel.

  • ghaff 2 days ago

    Even in the larger commercial computer space, Japan always liked to sorta do their own thing. Aside from a couple other companies, they were always big Itanium backers for example.

    I was an analyst during that period and Japan was always something of an outlier. (Europe was to some degree as well. But less so.)

    • bonzini 2 days ago

      Regarding Itanium, that could be just by chance, because Fujitsu is the main (or almost only) seller of supercomputers in Japan and was partnered with HP. Also interesting, however, is how they switched to it from SPARC and actually kept using SPARC longer than the West.

      • ghaff 2 days ago

        NEC and Hitachi were involved as well. There was some sort of dynamic of collectively wanting something different from commodity x86.

  • Dwedit 2 days ago

    DOS/V (not to be confused with the similarly named MS-DOS 5.0) is what made standard PCs able to run Japanese software. It provided a software emulation of a full Japanese text mode with kanji, requiring only VGA.

  • aurizon 2 days ago

    Your analysis is correct. I recall in the mid-80's a number of Japanese PCs emerged. They all seemed to want to make a walled garden so they could sell add-ons into it for extra $$, while, as you say, 'the hills are alive' with mass-made items for the emergent PC standard enabled by Microsoft's sale/licensing of MS-DOS (while preserving their ability to sell fully compatible same-ware). There was only a small hurdle, the IBM BIOS, which IBM lawyered to death any and all copiers of. Soon clean-room original BIOSes were created by a few companies and the gold rush was on.

    IBM had lost all hope, but sold enough to business and governments to make a good business. They tried a walled garden with OS/2 and their MCA (Micro Channel Architecture) bus. All US PC makers jumped on this, millions of plug-in cards were made - buyers - crickets, crickets, crickets. I suspect many billions went down that rabbit hole. I recall one scrap salvager processing tens of thousands of Zenith MCA boards for gold a few years later.

    The Japanese finally saw the light and focussed on laptops, and those had a good run, but finally they faded. IBM also made their ThinkPads, but eventually sold to Lenovo. US makers - Dell, Apple and HP persist - I am not sure which are US-made? https://en.wikipedia.org/wiki/List_of_laptop_brands_and_manu....

  • ViktorRay 2 days ago

    Very interesting! Thanks for posting this!

pyeri 2 days ago

Anyone remember Toshiba laptops? Their build quality was top-notch, and they were quite durable for the price point they came in at. Then around 2012-14 they started disappearing from the market. What happened to them?

  • agumonkey 2 days ago

    In a different niche, Panasonic secured a solid spot with their Toughbooks.

  • lifeformed 2 days ago

    They rebranded to Dynabook I think?

    • Dalewyn 2 days ago

      They were always called Dynabooks by Toshiba[1] and still continue to be sold today, but they retreated to only selling in Japan a long time ago and Toshiba ultimately sold the Dynabook brand off to Sharp in 2018 as part of selling off most of their businesses to avoid bankruptcy.

      For those who aren't aware, Toshiba is the quintessential example of Japanese Exceptionalising Into Failure(tm)[2].

      [1]: https://en.wikipedia.org/wiki/Dynabook_Inc.

      [2]: https://en.wikipedia.org/wiki/Fujio_Masuoka

      • Tor3 2 days ago

        I still have a Toshiba Dynabook lying around. Big and heavy, but otherwise a fine laptop.

        • dunghill 2 days ago

          Got a Dynabook too. It's a tank.

    • pyeri 2 days ago

      Just Googled it and learned they sold their stake in the PC division to another Japanese company, Sharp, in 2018; Sharp later rebranded the business as Dynabook. But it no longer has the same traction as the original Toshiba, nor is it available on most online e-commerce stores.

      • rvba 2 days ago

        Can someone with a background in marketing explain why so many companies drop their recognizable brand names and start using other names, which have lower market penetration?

        My theory is that it's either mergers/spinoffs... or someone in marketing wants to get a bonus, so they kill the brand.

        Someone at Blizzard wanted to rename Battle.net to the Blizzard app, for example.

        • simne a day ago

          The cause could be regulations, or strict rules from some bigger brand.

          For example, anti-monopoly regulators look at things very formally: when somebody appears to hold more than 30% of a market, prohibitively high taxes can kick in.

          Another example: Nintendo limited the number of games published per brand, so big companies created daughter companies that formally own some of the IP.

          • rvba 17 hours ago

            And how is changing the brand name supposed to help with that?

        • ghaff 2 days ago

          There are a lot of reasons, often not good ones, but it’s often the case that the existing brand doesn’t really represent what the company does any longer, or is too narrow.

          Of course it may just be a marketing VP who wants a prestige project on their resume.

  • alliao a day ago

    I'm still in love with Panasonic's Let's Note series, still made in Japan and very durable.

  • kossTKR 2 days ago

    There was also the Sony VAIO laptop line, which I remember as being quite a good product.

    But yeah they all seemed to die out.

    • hinkley 2 days ago

      Fujitsu Lifebook. Smallest laptop with a removable CD-ROM drive.

    • alliao a day ago

      The VAIO brand got bought and reincarnated with a very uninspiring lineup.

  • alliao a day ago

    They'll probably get resurrected by some Taiwanese laptop OEMs/ODMs.

  • lotsofpulp 2 days ago

    For me, the MacBook Air happened.

    My progression was a Toshiba Satellite in 1999 and 2002, then HP business-line laptops in 2005 and 2008, and then once I tried a MacBook Air and could close the lid without worrying the laptop would stay on and overheat in my backpack, I never went back to Windows. Plus it was lightweight and the battery was the longest-lasting.

    I am guessing businesses using Windows mostly used HP/Dell/Lenovo business-line computers because they had extensive same-day/next-day on-site support, so the business Windows market went to them, and lots of the personal laptop market went to the MacBook Air, leaving an untenably small, low-margin market for non-HP/Dell/Lenovo companies.

    And then some of that went to smartphones/tablets too. And after SSDs hit the market, technological advancement slowed so a new laptop was not that much better than an old laptop, so you could keep using it for 5+ years easy. I typed this on a 2015 Air.

    • rustcleaner 2 days ago

      I am not picking on you in particular but want to bring a general point up to readers:

      Why is it I always read the parent opinion when it comes to laptop shopping, but never the opposite (which is mine):

      -Bigger -> better

      -Thicker -> better

      -Terrible battery life

      -Desktop CPU and GPU

      -Blow torch thermal exhaust

      -Maximum storage and RAM

      All so I can run lm-studio, having Llama 3 pump out my edgy 4chan shitposts and effortposts for me, within the absolute comfiness of Qubes OS. I couldn't imagine relying on something thin and low powered unless it did very specific things (and, if it can't run Qubes, then it stays offline).

      • kbelder 11 hours ago

        I absolutely agree. There are such clear benefits to thicker and heavier laptops. Drawbacks as well, obviously, but it is crazy how lopsided the market seems to be in favor of compactness, even at the cost of power, battery, and reliability.

        Same applies to phones.

      • lproven a day ago

        I agree. I am typing on a MacBook Air but I don't like it much. Too thin, terrible keyboard, terrible buttonless trackpad. It has almost no ports and no expansion. There's no easy way to middle-click without addons.

        The only reason I use it is its battery life. Work provided it; it's not mine and I'd not choose one myself.

        For my own, I want a big thick chunky thing with 3 physical mouse buttons, lots of ports, a swappable (and ideally hot swappable) battery and ideally 2 of them.

        There's a space for thin'n'light, but there are next to no commercial options for those who want fat'n'flexible.

      • pezezin a day ago

        You want a gaming laptop.

      • lotsofpulp 2 days ago

        Because the magnitude of demand similar to yours is so low that I doubt many manufacturers can exist profitably in that space, hence Toshiba and others exiting the market.

      • fragmede 2 days ago

        Because what you've described isn't a laptop but a shitty desktop computer. Laptops that large are heavy, which sucks to travel with. Such a monstrosity isn't convenient to throw into your bag and bring with you everywhere all day. If you're not even using it for graphics, get a desktop, install Tailscale, and remote in for your shitposts.

        That's the opinion, anyway. Nothing wrong with your opinion (or mine), they're just opinions after all, but if I was on my feet all day in the field and needed a laptop, I'd rather have something small and light with a long battery life. If I wanted something luggable, I'd go for a USB-C monitor and as small a desktop chassis as would fit a proper GPU inside, rather than a mobile one. But dealing with the cables for that would get old real quick, so a laptop like you describe would be my pick in those circumstances. E.g. my friend's a VJ who performs at random clubs, and a gaming laptop like you described is the most appropriate device.

    • underlipton 2 days ago

      Having worked at an electronics retailer, I can tell you that "lots" is doing some heavy lifting in characterizing how much of the laptop market is taken up by MacBooks, let alone the Air specifically. Certainly, many college students wouldn't be caught dead with a Windows computer, but most people were happy to walk out with something half the price of a Mac, particularly when they're far more used to Windows at school and work than OS X. Acer and Asus were beginning to take up a decent portion of our sales by the time I left.

      • lotsofpulp 2 days ago

        MacBook Airs are cheap, in my opinion. I spent $900 on mine 9 years ago. The amortized cost over its lifetime is very competitive with even the cheapest Windows laptops.

        Even now, an M3 13in with 16GB ram and 512GB SSD is only $1,300. An M3 13in with 8GB ram and 256GB SSD is $900, and that would be good enough for 90% of people for 5+ years easy.

        https://www.costco.com/macbook-air-laptop-13-inch---apple-m3...

        • underlipton 2 days ago

          And (what to most consumers seems like) an equivalent Windows machine can be had for less than half that. After 2.5 years, you can upgrade. This argument has been had ad nauseam; it's not interesting anymore. I just wanted to point out what seemed like a mistake in your analysis re: sales. If you'd like to discuss that, that's fine.

          • lotsofpulp 21 hours ago

            >After 2.5 years, you can upgrade.

            This is a huge drawback for most people. I don’t want to “upgrade”, and spend time and effort transferring to new devices. I want the device I have to work for as long as possible.

            • underlipton 21 hours ago

              Again, from my experience selling these things: "most people" don't want to be stuck with a laptop with 8gb of RAM soldered to the board. They wait until the cost to repair is more than the price of a new laptop, or for when the time for a repair would be too long, or until their hard drive is full (these people don't bother transferring if they're not getting a device with a higher capacity drive). Then, they buy the cheapest laptop that fits their power needs/brand loyalty and that seems like it'll last 2-4 years. Your wants are atypical.

              • lotsofpulp 15 hours ago

                I think the exiting of basically all laptop makers that used to sell non soldered RAM proves what the market wants.

                People just want stuff to work. They don’t want to think about how it works. I am using 4GB RAM today on a 2015 MacBook Air, and it works acceptably for excel, editing PDF, emails, maps, watching video, YouTube, web browsing, chatting, video calling, and pretty much everything else that the vast majority of people do.

                An 8GB M3 laptop today will easily last 5+ years for most people’s needs. And they don’t even have to worry about what RAM is or does.

BoingBoomTschak 2 days ago

How about the US and MS ruining everything as usual?

"In April 1989 the Office of the U.S. Trade Representative issued a preliminary report accusing BTRON of being a trade barrier, as it only functioned in Japan, and asked the Japanese government not to make it standard in schools. TRON was included along with rice, semiconductors, and telecommunications equipment in a list of items targeted by Super-301 (complete stop of import based on section 301 of the Omnibus Trade and Competitiveness Act of 1988). It was removed from the list after the USTR inspection team visited the TRON Association in May. In June the Japanese government expressed their regret at U.S. intervention but accepted this request not to make it standard in schools, thus ending the BTRON project. Callon opines that the project had nevertheless run into such difficulties that the U.S. intervention allowed the government to save face from cancelling the project.

According to a report from The Wall Street Journal, in 1989 US officials feared that TRON could undercut American dominance in computers, but that in the end PC software and chips based on the TRON technology proved no match for Windows and Intel's processors as a global standard. In the 1980s Microsoft had at least once lobbied Washington about TRON until backing off, but Ken Sakamura himself believed Microsoft wasn't the impetus behind the Super-301 listing in 1989. Known for his off the cuff remarks, in 2004 governor of Tokyo Shintaro Ishihara mentioned in his column post concerning international trade policy that TRON was dropped because Carla Anderson Hills had threatened Ryutaro Hashimoto over it."

https://en.wikipedia.org/wiki/TRON_Project

  • Dalewyn 2 days ago

    No; Japan would have eventually stumbled and fallen into obscurity even without any American interference. It's a running joke at this point, because the Japanese cannot compromise quality to make costs reasonable.

    In an environment where the world produces goods 80~90% as good as Japanese ones for 1/4th~1/8th the cost, who in their right mind buys Japanese? This is what happened to Japan's entire electronics industry, home appliance industry, and more. Japan still has a domestic computer industry by a technicality (it's all Made In China) primarily fended over by Hitachi and Panasonic, but most westerners likely won't know because they simply don't sell overseas.

    I'm not going to bother getting into how Japanese are horrible with software too.

    • hakfoo 2 days ago

      The cost/quality thing doesn't mean they couldn't have maintained a "halo product" line that steered the platform. It could have been like the early 1990s PC market: government with a 'buy domestic' mandate or budget-no-object buyers might have bought a PS/2 Model 80 or Deskpro 386, but the masses, especially overseas, would buy a white-box 386DX made of Taiwanese parts.

      There's an interesting contrast: while Japan produced MSX -- a clear example that a multi-vendor standard can be wildly successful -- they missed the idea of a clone ecosystem for their heavier-duty professional machines. Why weren't there vendors cranking out clones of the PC-98, FM Towns, or X68000? Did they require more proprietary special sauce than an IBM 5150, or was there a cultural/market difficulty that would have caused them to flop on the market?

      OTOH, perhaps part of the problem was that the features Japan needed had poor cost/benefit ratios outside of the CJK market: you either have to make the enhanced video stuff optional, reducing platform standardization, or charge people for a feature they don't see as immediately beneficial.

    • justsomehnguy 2 days ago

      > It's a running joke at this point, because Japanese cannot compromise quality to make costs reasonable.

      An old anecdote:

      A company places an order to produce some gizmos through a Japanese firm. The order states that there could be "up to three non-working gizmos per one hundred in the final shipment". The order is completed, the employee comes to get it, and sees some big packed boxes and a small one. He asks "What's in the big boxes?" and receives "That's your one hundred gizmos, like you ordered". He asks again "But what is in this small box there?" and receives "That's your three non-working gizmos, like you ordered".

    • pezezin 2 days ago

      > It's a running joke at this point, because Japanese cannot compromise quality to make costs reasonable.

      As someone living in Japan, I don't buy this argument. There are plenty of shoddy Japanese products, number one being the houses (but I guess that is not something that you can export).

      > I'm not going to bother getting into how Japanese are horrible with software too.

      Here I fully agree. Anybody who has had to suffer the software and web services made here knows how bad it can be.

      • endemic 8 hours ago

        Funny that Japanese _game_ software is brilliant.

    • hedora 2 days ago

      My Switch and midrange Sony TV (which just turned ten years old, and is still “good enough”) seem like decent counterexamples.

    • BoingBoomTschak 2 days ago

      Cars and game consoles kinda disagree with you. For example, Mazda (now exclusively built in Japan) is a fair compromise between quality and cost.

      About software, I don't know. They seem to be very decent as far as research and embedded go, and even do a bit of open source (I can cite https://github.com/guicho271828, https://github.com/fukamachi and https://github.com/cxxxr off the top of my head), but consumer software appears to be another story. I don't think it's for this reason, though.

      I can even remember them having a strong affinity with Prolog... ah, there's the link: https://en.wikipedia.org/wiki/Fifth_Generation_Computer_Syst...

  • TacticalCoder 2 days ago

    > How about the US and MS ruining everything as usual?

    My personal theory is that MS set the human race back at least 20 years. YMMV.

    • agumonkey 2 days ago

      How would your alternate timeline look, genuinely curious. Sometimes I fear that the current one looks mediocre but if you took another path it would have been worse for random reasons.

      • ghaff 2 days ago

        Given the independent software vendor model, you were going to end up with a very small number of mass market operating systems/platforms one way or the other. It was certainly happening in the large commercial computer system space. In fact, the somewhat surprise is that Microsoft didn't become even more dominant. A lot of people certainly expected it to.

RajT88 2 days ago

As an aside, on my recent trip to Japan I hit up all the crazy gaming stores hoping to find an FM Towns or the even rarer FM Towns Marty.

They looked at me like I was a three-headed monkey.

  • mappu 2 days ago

    I looked around too - Mandarake only had popular consoles; Super Potato in Akihabara and Retro TV Game Revival in Osaka had MSX, but no FM Towns. The store clerk read my enquiry off Google Translate on my phone and gave me a one-word reply: いいえ ("no").

    It probably takes local expertise to find one in someone's attic. Playing its Lupin III exclusive game might have to remain a MAME job.

    • pezezin 2 days ago

      If you are looking for retrocomputers in Akihabara, the place to visit is BEEP. It is the only shop that specializes in that kind of stuff.

  • Tiktaalik 2 days ago

    I did see some old PCs of that sort at Hard-off further afield. I don't recall where. May have been in the burbs around Osaka.

    In Tokyo the easily accessible source to have a peek at some computers like this is BEEP in Akihabara.

    Edit: Yep looking through my photos I saw an FM Towns Marty at a Hard Off in Kanazawa. ¥49500.

    Saw an actual FM Towns at a Hard Off in Hachioji. More expensive: ¥77000 when you combine the monitor and computer.

    That last big Eco Town in Hachioji was quite the old PC source. It also had a PC-98 and a boxed (!) X68000.

    • RajT88 2 days ago

      > ¥49500

      At the current exchange rate, that is a steal! Kanazawa is... sadly not typically a place you go if you're on a brief visit of a couple of weeks.

      When I was studying abroad in 2001, I saw an original FM Towns tower case basically in the junk pile. I wanted it even then for how distinctive it was, but had no clue what it was. What might have been!

  • Findecanor 2 days ago

    I think a better chance of finding anything vintage would be Yahoo! Auctions and if you're not living there: a proxy-shipping service such as Buyee.jp.

    BTW, my Holy Grails as a vintage keyboard collector are ergonomic keyboards with a columnar layout for the PC88 and PC98 computers... or a B-TRON keyboard.

pnw 2 days ago

Some of the Japanese PC platforms were sold in Australia. Shout-out to the Hitachi Peach, a curious 6809 machine reminiscent of an Apple II. My high school friend's father purchased one (it was cheaper than an actual PC) and we spent many weekends coding on it, but it was very difficult to find much info beyond the manual. IIRC it had Microsoft Basic but a proprietary OS. Parts of the manual were still in Japanese. It all seemed so exotic at the time.

JSR_FDED 2 days ago

I remember working in Taiwan in the late nineties and noticing that the PE2 editor for MS-DOS was incredibly popular, because you could easily define macros that mapped to multi-character sequences, which, combined with a BIOS that had character tables for simplified Chinese, meant you could generate Chinese text without too much difficulty. To this day my vimrc has some of those PE2 macros :-)

Razengan 2 days ago

In a YouTube video about the history of OS/2, I learned that ole Microsoft, back in the 1990s before their mob tactics were curtailed, used to send suited thugs to Japanese PC manufacturers to chastise them for even offering users the mere option of buying PCs with OS/2 instead of Windows..

Could such practices have stifled the innovation and growth within the Japanese PC industry?

They did have some takes of their own on the PC platform with that unique Japanese flavor, in series like the PC-8800/PC-98, FM Towns, etc.:

https://en.wikipedia.org/wiki/PC-8800_series

https://en.wikipedia.org/wiki/PC-98

https://en.wikipedia.org/wiki/FM_Towns

Who knows what more they could've done if Windows 95 hadn't smothered everything under the sun? Like the ill-deserved demise of the Commodore Amiga, this seems like a failure of politics more than of merit.

  • FMecha 2 days ago

    >In a YouTube video about the history of OS/2, I learned that ole Microsoft back in the 1990s before their mob tactics were curtailed, used to send suited thugs to Japanese PC manufacturers to chastise them for even offering users the mere option of buying PCs with OS/2 instead of Windows..

    That, out of context, gives the impression/implication that Microsoft's Japanese operations had ties with the yakuza.

    And for PC-98 and FM Towns, versions of Windows were available for them (with FM Towns' uniqueness diminishing gradually as Fujitsu embraced DOS/V).

tantalor 2 days ago

This would be much more comprehensible if the author would include some dates.

amelius 2 days ago

> The thing about DOS is that it’s a much thinner OS than what we think of in 2024. When you’re writing DOS software of any kind of complexity, you’re talking straight to the hardware, or to drivers that are specific to particular classes of hardware.

How is that different from how we access our GPUs nowadays?

  • bitmasher9 2 days ago

    Today we access GPUs through standard APIs like DirectX, CUDA, OpenGL etc. In DOS there weren't these standards, and you had to write code for specific hardware.
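
    As a hedged illustration (my own sketch, not from the comment above): under DOS, "writing code for specific hardware" often meant poking the hardware yourself. Assuming a real-mode DOS C compiler such as Turbo C, this sets VGA mode 13h through a BIOS interrupt and writes a pixel straight into video memory at segment 0xA000, with no driver or API in between:

        /* Hypothetical sketch of DOS-era direct hardware access (VGA mode 13h). */
        #include <dos.h>

        int main(void) {
            union REGS r;
            r.x.ax = 0x0013;            /* BIOS int 10h, AH=0: set 320x200x256 mode 13h */
            int86(0x10, &r, &r);

            /* Mode 13h video memory is a linear framebuffer at segment 0xA000. */
            unsigned char far *vga = (unsigned char far *) MK_FP(0xA000, 0);
            vga[100 * 320 + 160] = 15;  /* white pixel at (160, 100), no driver involved */

            return 0;
        }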

    • DaoVeles a day ago

      This is why DOS games that supported 3DFX just talked directly to the GPU. They documented all the register mappings for direct access. You didn't even need to install drivers.

      DOS was in its twilight years at that point; Nvidia/ATI didn't bother with DOS support and just kept their cards behind OpenGL/DirectX.

    • amelius 2 days ago

      True, but our OS has no idea what processes run on which GPU, or how much GPU memory they use. Everything on the GPU happens behind the back of the OS.

      • TazeTSchnitzel 2 days ago

        That's not true, the kernel driver knows what page mappings belong to particular processes, including GPU page mappings. Moreover, you have no choice but to talk to that kernel driver; you can't go behind its back and talk truly directly to the GPU, even if you bypass the userspace GPU driver, because this would allow circumventing memory protection. It is true, however, that modern GPU kernel drivers are relatively thin.

0xbadcafebee 3 days ago

I never thought about this before, but product competition is basically evolution in action. Entities with more desirable traits that adapt better to a given ecosystem survive, the rest don't. (In addition to things like a pre-existing dominant species having advantages over new ones)

(fwiw, Windows won out because it had better business strategy. Apple wanted to be in everyone's homes; Microsoft wanted to be in everyone's business. One of those is easier to sell to in bulk, and easier to charge more money. In addition, Windows being more hardware-agnostic, and encouraging an ecosystem of competing hardware manufacturers, allowed them to invest less in hardware themselves, while creating an industry that would vie for business on Microsoft's behalf. This is of course different than the "workstation" market of uber-high-powered individual computers, which sort-of still exists, though with PC hardware)

  • acdha 2 days ago

    Windows also won by parasitizing a previously bigger host (Bill Gates’ mother was on IBM’s board), and shutting out competition by forcing vendors not to offer other companies’ software if they wanted Microsoft licenses at better than retail pricing.

    • nine_k 2 days ago

      This is how MS-DOS and early Windows won. But the first version of MS Excel was written for MacOS. And it's MSO that's holding businesses on Windows, not the other way around.

      • orionblastar 2 days ago

        MSO doesn't work in WINE; at least the latest versions don't.

        • smegger001 2 days ago

          I recall being told that Office uses undocumented Windows APIs, making it harder for groups like WINE and Proton to support them. Whether that's the intent or a happy accident, well...

    • canucker2016 2 days ago

      Bill Gates' mother, Mary Gates, was not an IBM board member.

      She was on the national United Way's executive committee. Also an executive committee member was IBM's Chairman, John Opel.

      see https://www.nytimes.com/1994/06/11/obituaries/mary-gates-64-...

      [edit]

      also Windows OEMs always got lower than retail price for Windows licenses (assuming your volume sold was high enough)

      from https://en.wikipedia.org/wiki/Bundling_of_Microsoft_Windows#... :

      ====

      Microsoft once assessed license fees based on the number of computers an OEM sold, regardless of whether a Windows license was included. Beginning in 1983, Microsoft sold MS-DOS licenses to OEMs on an individually negotiated basis. The contracts required OEMs to purchase a number of MS-DOS licenses equal to or greater than the number of computers sold, with the result of zero marginal cost for OEMs to include MS-DOS. Installing an operating system other than MS-DOS would effectively require double payment of operating system royalties. Also, Microsoft penalized OEMs that installed alternative operating systems by making their license terms less favorable. Microsoft entered into a consent decree in 1994 that barred them from conditioning the availability of Windows licenses or varying their prices based on whether OEMs distributed other operating systems.

      ...

      In 2009, Microsoft stated that it has always charged OEMs about $50 for a Windows license on a $1,000 computer.

      ====

      • acdha 2 days ago

        You’re right about the boards - it’s been a while but the main point was that this wasn’t just pure open competition for their biggest break.

        My focus on the licensing was this part which your quote included: “Microsoft penalized OEMs that installed alternative operating systems by making their license terms less favorable”. The consent decree and other legal cases took a while to apply any effective counter pressure, and by that point Microsoft had managed to effectively starve competitors (DR-DOS, GEOS, BeOS, OS/2, etc.) of revenue which would have made the 80s and 90s marketplace more competitive. They knew that staying the default choice for businesses as long as possible meant that those companies would acquire a library of software and training which only worked for their operating systems, and successfully banked on a slow government response.

  • makeitdouble 2 days ago

    We're talking about international trade, so trade barriers and foreign policies play a huge role that goes beyond simple competition.

    Japan had a hell of a time dealing with car exports to the US; Sony still won, but the reaction to it was also extremely strong.

    A Japanese company had absolutely no chance to go hit the US market and displace companies like Sun, Apple or IBM on US soil. In particular, software IP is a whole lot harder to fight for than manufacturing IP (even if it gets stolen, it's moot if it can't be physically applied; in software land reproduction is a given).

    Europe wasn't a much easier target either. We saw that later with DoCoMo failing miserably while having a clearly superior product.

    Conversely, the US has a long history of opening the Japanese market when it really wants to, and forgoing that market isn't critical either, so there's quite an asymmetry.

    The only ground Japanese companies have learned to properly fight on has been video games so far.

    • tihwih2o28092 2 days ago

      > The only ground Japanese companies have learned to properly fight on has been video games so far.

      Nintendo yes; Sony PS division is more or less run out of SIE in San Mateo.

      • FMecha 2 days ago

        And the rest is nowadays mostly gacha developers, to which South Korea and China are also eating up their lunch, even at home.

    • mcdow 2 days ago

      Could you speak more to the US “opening the Japanese market when they really want to”? I’m not familiar with this.

      • makeitdouble 2 days ago

        A bit far from the technology related fields, but the beef import agreements are the most explicit on this part [0]. US beef was found in clear violation of the safety rules multiple times, but Japan had to bend anyway ("The government has put priority on the political schedule between the two countries, not on food safety or human health.").

        Japan is typically not doing great in its relationship with Korea or China, or even India, nor the EU really (France and Germany are closely friendly, but won't give much economic benefit), so the US has pretty strong leverage when it comes to negotiations.

        [0] https://en.wikipedia.org/wiki/United_States_beef_imports_in_...

        PS: there's a mountain of other reasons now, but Japan and Russia were also not doing great as they've been fighting over the northern islands for decades. To my eyes they really really suck at international relations in general.

      • cherryteastain 2 days ago

        i.e. "open up your market to us and stay silent about our tariffs on your products, or else"

  • hedora 2 days ago

    I first figured this out when looking at filesystems. Ext4 is a generation behind NetApp WAFL (from the 1990s). ZFS is arguably competitive with some of the enterprise filers from back when Sun existed.

    There have been at least four generations of commercial filesystems since then. So, a randomly chosen on-prem filer will be 3-5 generations ahead of Linux.

    The reason is economic: In that space, companies have a half life of 5 years, but influential open source filesystems last 20+ years.

    • ghaff 2 days ago

      ZFS would almost certainly be more widely adopted were it under a different license. But Red Hat, for one, wasn't going to touch it given CDDL.

  • bane 2 days ago

    Yes! And it's very interesting to consider two additional things:

    1. how seemingly "less capable" technologies win out in this evolutionary environment

    2. how plentiful VC (and to some extent government funding for R&D) distorts normal "evolutionary" forces in a market

    • 0134340 2 days ago

      1. In that case, those that were adopted tended to be the cheaper and more ubiquitous technologies, i.e., at a biological level just calorically cheaper to adopt and perhaps more efficient to maintain.

      2. VC and general funding, i.e. supporting an entity, is a feature of evolution.

      I guess I expected better of HN but it seems people don't realize that nothing we can do will stop evolution and everything we do is just a feature of it.

    • 0xbadcafebee 2 days ago

      Yeah; I think it's our ego, or "common sense", that makes us think that the most advanced thing will win out in the end. In reality it's the most well-adapted thing that wins out in the long term. If you ever see a thing and think "this is really advanced, this must be the future", think again.

      I'm not sure I agree that VC, government, etc distorts evolution in a democratic republic with a capitalist economy. Evolution still happens within that ecosystem. The economy is the climate, and the laws, politics, people, culture are the landscape. That ecosystem includes VCs along with all the other things. Within that environment, the best adapted thing survives. Sadly, that "thing" (a corporation, for example) may make poor decisions that lead to its demise, the way a snow leopard leaping for a goat on a mountain side may make them both tumble to their doom. Nature's a cruel mistress.

      I could even see something like Soviet Russia being an ecosystem that evolution still happens in. It's a very extreme environment, to be sure, but evolution still happens in the most extreme environments on earth.

      • bane 2 days ago

        > I'm not sure I agree that VC, government, etc distorts evolution in a democratic republic with a capitalist economy. Evolution still happens within that ecosystem.

        I'm definitely not claiming that evolution doesn't occur, but that the temporary supply of capital from VCs (and similar sources) creates an artificial environment with distorted evolutionary pressures. When the VC money runs out, and technologies are "returned to a natural ecosystem", the end result is often not that the VC funded tech was the fittest in the market. Meanwhile, if a particular technology consumed the market by distorting its fitness function and eliminated otherwise healthy competitors along the way, the VC backed tech might survive in the end, but it's not clear that it would be the fittest given natural market forces. However, everybody else is dead.

        To really stretch this analogy, it's a bit like an environment with several predatory cat species - each good at a particular strategy. One day a team of hunters comes in, kills all the prey, and selects a specific cat species as "the champion" and simply feeds it in captivity. Once the rest of the cat species die off, they release the champion into the wild where lack of competition stalls its own further evolution. What cat would have been the "best" in the given ecosystem? Nobody knows.

        There's a few interesting cases around this. Palantir comes to mind as a particularly useful case. They entered a market with a dozen competitors all surviving in the market, flooded it with something like 13 or 14 rounds of fundraising's worth of product, nearly annihilated everybody else in the market and when finally released to the wild, found that their signature product line was no longer what the market wanted at the size of the market the VCs had assumed. Today the company more or less acts as a bespoke web application company. Mimic competitors like C3.ai are also suffering under similar fates.

        For a government funded use-case look no further than the space launch market. For decades it was funded by dumb government money and regulations that gave the appearance of a competitive market but was intended more to ensure strategic options. Those companies evolved/optimized to capture this steady supply of crippled prey money but it really created a bunch of sick, inbred, organizations. SpaceX appears, acts like a hungry wild tiger in a room full of Pugs and proceeds to dismantle them all, while building new markets and capturing the easy prey dripping out of the government funnel. There's no magic physics to SpaceX, all of the competencies existed in the industrial pipeline, but the government funding mechanism evolved them in unnatural ways. SpaceX, being heavily government funded, may yet evolve into one of those, but for the moment it's the difference between a Wolf and a domesticated small apartment dog.

  • 0134340 2 days ago

    And entities that become too great and harmful to other entities (monopolistic) get challenged, even if they can provide some good, and from that challenge they sometimes get parasitized as well. Business competition, therefore human behavior, is natural no matter what way you want to politicize it.

fnord77 2 days ago

sadly, no pics. The FM Towns was kinda cool looking

permo-w 3 days ago

is "PC platform" the standard term here? I'm not saying it isn't, it just sounds a little odd to me.

  • bane 2 days ago

    Yes, it stands for "Personal Computer Platform".

nisten 2 days ago

they got old

Apocryphon 2 days ago

Also worth watching: Why is Japan So Weak in Software? by Asianometry

https://www.youtube.com/watch?v=ky1nGQhHTso

  • canucker2016 2 days ago

    Reminds me of a Japanese software company I applied and interviewed at when I graduated university. Company name? Bug Software.

    A quick internet search shows no relevant results for the company.

    • ripcity512 2 days ago

      Was this the place?

      bug.co.jp

      Apparently they changed their name last year.

  • terminalgravity 2 days ago

    I wish there was a TL;DW bot to summarize videos like this. I’m curious but not in a place where I could easily watch a video.

    • chmod775 2 days ago

      Asianometry's videos are good precisely because of the detail and background he goes into. If you summarize them you take that away and pretty much just end up with what has already been said here.

      • immibis 2 days ago

        I tend to bookmark Asianometry videos to watch later because they seem very informative but I'm rarely in the mood to actually listen to the very dry documentary style. They don't make good background noise for instance - I need to focus to accept the new information. At the same time they don't naturally attract my focus.

      • Dalewyn 2 days ago

        It's a 20 minute long video, the information density can almost certainly be denser.

        • chmod775 2 days ago

          > It's a 20 minute long video

          You mean 20 minutes short. There's enough in there to blow it up into a 45 minute documentary at least. You already spent more than 20 minutes commenting under this story.

          > The information density can almost certainly be denser.

          And what would be the point of that? There's a limited amount of information one can retain in a short span of time, and it's not like he repeats himself or has a verbose style.

          I already go back and rewatch his videos later, taking new pieces of information from them.

          Again, if you want the tldw, it's already in the comments here. If you want the details, go watch the video.

          The video is being linked because the video itself is good. Wanting a summary that retains the same qualities is like wanting to have your cake and eat it too.

          • Xelbair 2 days ago

            Video by itself is less information dense than text. A 20 minute video could've been a 2-5 minute read as an essay.

            Not to mention the convenience of being able to easily re-read parts of it, and to find relevant information instead of seeking through the video.

          • Dalewyn 2 days ago

            I generally read faster than some narrator slowly babbling on over a meandering script, so that is 20 minutes long. If the video is 20 minutes long, I wager I can read an equivalent article in less than 5 minutes and come out enlightened all the same.

            Videos are great for getting the eyes of the general man who doesn't have a preconceived interest in a subject, you're trying to bait clicks and videos are great for that. For people already interested in the subject though? Videos are almost always a literal waste of time compared to a well written article.

            And if you wanna say I have a short attention span: Sue me. I'm a 35 year old millennial; we're infamous for having short attention spans.

            • nottorp 2 days ago

              > If the video is 20 minutes long, I wager I can read an equivalent article in less than 5 minutes and come out enlightened all the same.

              Not to mention that skimming through this page of HN comments does NOT take 20 minutes. More like 45 seconds.

              > And if you wanna say I have a short attention span

              Short life. Not attention span. If you get your info in writing you waste only a quarter as much of your life getting it.

            • SllX 2 days ago

              You my friend may benefit from developing the arts of the 2x speed, the skipping, the scrubbing and the stopping.

              Not every video is worth watching to completion (some are, you get a feel for it), there may be background details you want to skip or scrub through eyeballing the thumbnails depending on familiarity with the subject matter and sometimes everything you want to know is right at the end of the video in a neat little summary. The comments can even give you some insight into where the video is going and whether you want to continue if you read through some of the top ones during playback.

              I’m not much younger than you, but watching and re-pacing YouTube for educational/information videos is a skill that can be refined and the visual imagery can provide details that again, depending on what it is, might be missed in a written summary. And hey, if none of this is for you, maybe this comment helps someone else out.

              • bigstrat2003 2 days ago

                I mean... you can do that, yes. Or we could use the far superior medium of text, where you don't need hacks to get around how slow it is.

                • bdw5204 2 days ago

                  One reasonable compromise would be for video makers to provide a transcript or written article to complement their video. Video is a terrible format especially when you're actually using the video and not just using it as a mechanism to deliver audio. Audio is not a bad medium because you can do something else while listening to it.

                • SllX 2 days ago

                  I mean you could restrict yourself to only a single medium, independent of what the rest of the world is doing; or you can learn to process information efficiently regardless of medium and respect each medium for its own strengths and weaknesses. A good YouTube video produced perfectly needs none of the “hacks” I listed above and will relay far more information on complex subject matter in context than just an essay will, but people who are more comfortable writing will write and people who want to make videos will make videos.

                  • lotsofpulp 2 days ago

                    There is a slight conflict of interest where more money can be earned by wasting the information recipient’s time via advertising. Text offers less opportunity to do this.

                    Perhaps some amount of time wastage is necessary to incentivize the information providers to provide the information, but the pendulum can also swing too far.

                    • SllX 2 days ago

                      That’s why I got good at getting through videos quickly and figuring out when or if they’re a waste of time.

                      There’s plenty of “research” videos that are just spewing crap that can be found on a wiki or a database somewhere else on the web; but see enough of them and you pick up on the pattern and cadence and quality they’re produced at quickly enough to just move on when you see it.

                  • ddingus 2 days ago

                    This too is the way.

                    I am quite happy to take good info produced for me in almost any form.

                    We all have options.

              • ddingus 2 days ago

                This is the way, along with just listening during other tasks.

            • ddingus 2 days ago

              Same. Reading is always faster than watching video.

              However, listening to one can be done while driving, or doing many other tasks.

              Expecting producers to cater to the can read fast crowd is not realistic. People are just not going to produce for us. And I do not believe they should.

              There are options. A big one is listening.

              • smegger001 2 days ago

                especially now with YouTubers vainly trying to placate the fickle algorithm gods by stretching out videos to meet minimum length requirements

                • ddingus 2 days ago

                  Yeah. Can't blame 'em for that. It sucks, and I bet most of them hate doing it.

        • rowanG077 2 days ago

          I don't understand this argument. What has video length to do with whether it can be denser? This is like looking at a 1gb file and saying it could certainly be smaller.

          • ddingus 2 days ago

            The commenter believes the video should take less time and contain a higher percentage of strictly factual information.

            A text analogy might be a recipe written in simple style, steps, ingredients, etc. and one you might find on a food blog where there is an intro about their childhood, how Nana was the best and along the way, somewhere in there one might learn how to prepare the food.

            In this case, the video producer made pretty good choices about info density and content length.

            The commenter disagrees and here we are chatting about all that.

        • ddingus 2 days ago

          Nope. That producer packs it in solid. Yes, it could be more dense, but at the expense of it being watchable by most people.

          This is a case of just because one can does not mean one should.

          Having an audience matters. It matters more than optimal info density does. Besides, just watch it at 2x. With this producer doing that is challenging. Pay attention!

          :)

    • littlecranky67 2 days ago

      There is kagi (paid search engine) summarizer for youtube videos: https://kagi.com/summarizer

      As for the above link, it gives:

      "Japan has a large trade deficit in software, importing far more software and services than it exports. Despite having iconic hardware companies, Japan lacks major software giants like Microsoft or Oracle. This is due to a history of government policies that favored hardware over software development, as well as a shortage of skilled software engineers and a lack of software startups in Japan. While Japan has made efforts to develop domestic software platforms, they have largely failed to gain traction. The video suggests there are no easy solutions to Japan's software industry challenges."

    • skissane 2 days ago

      I didn't watch the video but I skim read the YouTube transcript.

      The video doesn't propose any single explanation, just a series of events all of which arguably set back Japan's indigenous software industry. A few of the incidents it mentions include (my summary below is more based on my own knowledge of the topic, than what the video specifically mentions):

      Fujitsu and Hitachi cloned IBM mainframes. So did lots of other companies. At the time they started doing it, IBM was (intentionally) releasing their software into the public domain. However, in 1969, IBM announced they'd start copyrighting their software. Initially they still released the core OS (primarily MVS) into the public domain, and only copyrighted add-ons. However, as the 1970s progressed, more and more new functionality went into the copyrighted add-ons, while the public domain core received only limited enhancements. Finally, in the early 1980s, they put the whole OS under copyright.

      This left Fujitsu and Hitachi in a difficult position. They were used to getting their mainframe OS from IBM for free, and suddenly they couldn't legally do that any more. Legal choices for them would have included: (1) fork IBM's operating system and create new enhancements themselves (either clone IBM's copyrighted enhancements by clean-room engineering, or design their own incompatible enhancements), (2) negotiate with IBM for a license (unclear if IBM would agree, and may have cost $$$), (3) license an alternative operating system (e.g. UNIX), (4) build their own OS from scratch.

      But none of those options appealed to them (or maybe they tried some and it wasn't working out), so they decided to go with option (5): illegally copy IBM's copyrighted mainframe operating systems. They used the fact that IBM still shipped the source code for much of its copyrighted software to customers, and somehow got customers to (illegally) hand that source code over. They made rather trivial changes to the source code to try to hide the copying–for example, Fujitsu renamed a lot of IBM routines whose name started with the letter I, to start with the letter J instead. They searched and replaced IBM copyright notices with their own. They even bribed IBM employees to give them IBM confidential material (the IBM employees accepted the bribes as part of an FBI sting operation). And IBM found out, and sued both Fujitsu and Hitachi, and the settlement of the suit required Fujitsu and Hitachi to pay IBM hundreds of millions of dollars, and also banned Fujitsu and Hitachi from continuing to sell the software outside Japan (IBM agreed to let them continue selling it in Japan, in exchange for them paying licensing fees.)

      Other stuff I know about this topic (not in the video): In the 1980s and early 1990s, Fujitsu mainframes were quite popular in Australia, but due to this settlement, by the end of the 1990s, basically all of Fujitsu's Australian mainframe customers had either migrated to IBM mainframes, or else to non-mainframe platforms. There are still Fujitsu and Hitachi mainframes running in Japan today, but they are deeply legacy, basically stuck in the 1990s – they didn't follow IBM's transition to 64-bit in 2001. Fujitsu and Hitachi weren't the only mainframe vendors faced with this problem, but other vendors sought to solve it within the confines of the law. In the US, Amdahl had the same issue, but it decided to focus on their Unix variant UTS instead of MVS. (Amdahl did have an internal project to build a clone of IBM's MVS, apparently based on legal clean-room reverse engineering, called Aspen, but it got caught in development hell, and Amdahl cancelled it before they ever officially shipped it, although possibly a few customers got beta test versions.) Germany's Nixdorf had a fork of IBM's DOS/VS operating system (for low-end mainframes), which they got by acquiring the American company TCSC; they ported the Unix clone Coherent to run on top of it, before killing it off in the late 1980s when Nixdorf decided to give up on mainframes and focus purely on Unix instead. Other mainframe vendors didn't have this problem because their operating systems were not based on IBM's – for example, the other Japanese mainframe vendor, NEC, their mainframes run a fork of GE/Honeywell/Bull's GCOS operating system (ACOS), which NEC legally licensed.

      Another incident the video discusses is the TRON project, which was a Japanese indigenous standard for operating system APIs, endorsed by the Japanese government, conceptually similar to POSIX. It included both variants aimed at general purpose computing (BTRON) and embedded systems (ITRON). However, this frightened the US software industry, which convinced the US government to declare TRON a "trade barrier". And that mostly killed TRON as an operating system. TRON didn't die completely, it still sees some use in embedded systems even today (the video mentions the Nintendo Switch Joy-Con controllers run it), but it never achieved the original vision of becoming Japan's standard operating system. Instead, Microsoft Windows did.

      And then there were also macroeconomic issues (Japan's real estate crisis in the 1990s), and cultural issues – it mentions how the Japanese government encouraged Japanese industry to focus on copying successful Western technologies, even improving them incrementally in the process, as opposed to coming up with fundamentally novel technologies of their own. That approach served Japan very well for industries such as cars, but doesn't work so well for the software industry.

      • zzo38computer 2 days ago

        I have tried to figure out some of the details of TRON, but some are difficult to find because they are in Japanese and/or some files seem to be missing.

        (I think ITRON is still in use, but BTRON and CTRON are not as common these days, as far as I know.)

        There is also a FOSS implementation of BTRON called B-Free, but it seems to be incomplete and, as far as I can tell, abandoned. (There is also a year 2053 problem, which could be mitigated by using 64-bit timestamps, and some other problems.)

        (I have also had an idea for my own operating system design, which also uses the TRON character code, as well as other things. This could also be made into an operating system standard for which multiple implementations could be made, I would hope.)

        • skissane 2 days ago

          The ITRON specs can be downloaded from the TRON website: https://www.tron.org/specifications/

          Most of them are available in English, although a few of the more peripheral specs are Japanese-only.

          At least some of the BTRON and CTRON specs were published in English - http://tronweb.super-nova.co.jp/tronspecs.html - but don’t appear to be available online.

          CTRON appears to have been based on OSI (I see references to FTAM and MOTIS, the X.400 mail transport protocol) and also advertised support for ISDN as a key feature, which would make it very dated by today’s standards.

          I can’t find any references to actual specs for MTRON. I am wondering if it was ever actually specified, or if it was just vapourware

          > (I had also had idea of my own operating system design, which also uses TRON character code,

          You don’t need a whole operating system for that. It could just be a library which supported converting TRON code to other character sets, displaying text in TRON code, etc.

          • zzo38computer a day ago

            > The ITRON specs can be downloaded from the TRON website

            I had seen that, but many things (including much of the older stuff) seem to be missing.

            However, some stuff I found elsewhere (not from the TRON website), and I was able to partially figure it out from the Japanese documentation and write a program that can partially parse the TRON Application Databus format. However, many things I could not figure out very well.

            I also found what seems to be some documentation of the TRON instruction set (I have some interest in the instruction sets of some older computers, not only TRON), but it is in Japanese and it also seems that some files are missing, so I don't know how it works. (I also found some English documentation, but it does not actually explain much, although there are a few minor explanations.)

            > CTRON appears to have been based on OSI-I see references to FTAM and MOTIS (the X.400 mail transport protocol)

            However, I think X.400 uses ASN.1, and ASN.1 does not have a TRON string data type (I had once also wanted to use this in something else (unrelated to X.400 mail and CTRON), so I used the octet string type instead).

            > I can’t find any references to actual specs for MTRON. I am wondering if it was ever actually specified, or if it was just vapourware

            That was my guess as well, but I don't know either.

            > You don’t need a whole operating system for that. It could just be a library which supported converting TRON code to other character sets, displaying text in TRON code, etc.

            You are right, I do not need a whole operating system for that (see below). But the operating system design is helpful for many other things. The use of the TRON character code is only one of its features; it also has many other features, many of which are different from POSIX and other systems (although some things are similar to other systems). (I had written elsewhere about my ideas of operating system designs, too.)

            I had done some of the other stuff relating to TRON code on Linux too (although it is incomplete). I have written some programs that can display text, I have made fonts with the TRON character set (although not all planes are implemented), and some programs that can convert some character codes (including e.g. EUC-JP, EUC-CN, EUC-KR). I had also been able to write partial English documentation from what I could figure out (which I documented on the Just Solve The File Format Problem Wiki), although much of it is difficult.

            (One of the problems I have is the way the GT fonts are coded; they are several TrueType fonts, that use an improper Unicode mapping that does not seem to have anything to do with the actual Unicode characters that those numbers are supposed to correspond to (except font 1, which does correspond correctly to Unicode), and the mapping of improper Unicode into TRON code is given in a large PDF file, and the mapping seems to not have any sort of reasonable order, and that I could not figure out how to handle automatically. If I could figure out how to handle it properly then I could implement a file that can use them with the TRON code directly; someone who is Japanese and is able to compare the characters to figure out how to make up your own bitmap fonts with the GT character set, could do that too, I suppose.)

            • skissane a day ago

              The main selling point of the TRON character code, from what I understand, is CJKV speakers who disagree with Han unification.

              But it sounds like you don’t know Japanese. Do you know another CJKV language? If not, what makes the TRON character code attractive to you?

              Personally TRON interests me simply because it is an OS API which looks rather different from POSIX, and I’m interested in learning about other ways of doing things - just maybe some of those other ways of doing things contain some good ideas. But the TRON character code doesn’t really, since as a non-CJKV speaker, the debate about Han unification has no practical relevance, but rejecting that is the main selling point of the TRON encoding.

              • zzo38computer a day ago

                > But it sounds like you don’t know Japanese.

                Actually, I do know a little bit, but not very well. (I still don't like Han unification though. Also, that is not the only problem with Unicode (and some other character sets) anyways. Furthermore, I also think that one character set will not be suitable for all purposes anyways (and that it is not possible to make it so), so I also have interest to allow additional character sets to be available for use.)

                > Personally TRON interests me simply because it is an OS API which looks rather different from POSIX, and I’m interested in learning about other ways of doing things - just maybe some of those other ways of doing things contain some good ideas.

                I am also interested in it for this reason, too. (I also have interest to learn of some old Japanese computer systems, too.) (As you could see, I did mention stuff other than TRON character code, too)

                • skissane 2 hours ago

                  > Furthermore, I also think that one character set will not be suitable for all purposes anyways (and that it is not possible to make it so), so I also have interest to allow additional character sets to be available for use.)

                  There are a number of libraries in existence for converting between character sets. ICU is one of the most famous but it is Unicode-centric (it supports many character sets but wants to use Unicode as a lowest common denominator.) But older libraries such as iconv or recode lack ICU’s Unicode-centricity, so might be more appealing to you. Have you thought about doing something like contributing TRON support to recode?

                  https://github.com/rrthomas/recode - I always called it “GNU recode”, but https://www.gnu.org/software/recode/ says it isn’t a GNU package, even though for a long time the GNU website hosted it. (I think maybe it is “ex-GNU”: the GNU project maintainer retired, and the new maintainer who took over wasn’t willing to abide by the GNU project’s policies.)
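
                  As a hedged sketch (an illustration of my own, not from the comment above): this is roughly what library-based conversion looks like with POSIX iconv, using EUC-JP to UTF-8 since TRON code is not in the standard iconv tables; a TRON converter would need its own tables, e.g. as a contribution to recode as suggested above.

                      /* Minimal character set conversion with POSIX iconv (EUC-JP -> UTF-8). */
                      #include <iconv.h>
                      #include <stdio.h>
                      #include <string.h>

                      int main(void) {
                          iconv_t cd = iconv_open("UTF-8", "EUC-JP");   /* to-code, from-code */
                          if (cd == (iconv_t) -1) { perror("iconv_open"); return 1; }

                          char in[] = "\xC6\xFC\xCB\xDC";               /* "Nihon" (Japan) encoded in EUC-JP */
                          char out[64] = {0};
                          char *inp = in, *outp = out;
                          size_t inleft = strlen(in), outleft = sizeof(out) - 1;

                          if (iconv(cd, &inp, &inleft, &outp, &outleft) == (size_t) -1)
                              perror("iconv");
                          else
                              printf("%s\n", out);                      /* the same text, now UTF-8 */

                          iconv_close(cd);
                          return 0;
                      }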

      • formerly_proven 2 days ago

        Siemens did real mainframes, and their mainframe OS BS2000 is still around; it's just part of Fujitsu now. Nixdorf appears in that story as well, because that's how the Siemens mainframe division ended up at FSC (Siemens acquired Nixdorf, folded its mainframe division into that, then split it up into the ATM business and sold the rest to Fujitsu).

        • skissane 2 days ago

          Nixdorf shut down their mainframe business in 1989, and sold the remnants to Comparex (which started out as a Siemens-BASF joint venture, but Siemens withdrew around the same time as Comparex acquired Nixdorf's mainframe business). So when Siemens and Nixdorf merged in 1990, Siemens did not acquire Nixdorf's mainframe business, only Nixdorf's other product lines (Unix systems, ATMs, etc). But Siemens still had their own mainframe business. Comparex already sold IBM-compatible mainframes, so they didn't continue Nixdorf's mainframes as an independent hardware line, they were primarily buying the support contracts and the customer base.

          Siemens mainframes and Nixdorf mainframes had significant differences:

          Siemens BS2000 mainframes were derived from RCA Spectra 70. Their ISA was mostly IBM-compatible in user mode (problem state), but significantly different in kernel mode (supervisor state), and their operating system was completely incompatible–the BS2000 operating system was derived from RCA TSOS. RCA sold their mainframe business to Sperry, who then merged with Burroughs to form Unisys. The RCA Spectra mainframes became Unisys' Series 90 mainframe line, and RCA TSOS was renamed to Unisys VS/9. But by the 1980s or early 1990s, the RCA-derived Unisys mainframe line was dead. Whereas, their Sperry and Burroughs heritage mainframe lines (Unisys OS 2200 and Unisys MCP) survive today, although now they are software emulators running on x86-64 servers instead of physical hardware. RCA Spectra/TSOS only survives today in the BS2000 branch, save that Siemens ended up selling it to Fujitsu.

          By contrast, the Nixdorf mainframes were more straight IBM clones, and so aimed for instruction set compatibility both at the user application and operating system level, and could run IBM operating systems. They were mainly used with the low-end IBM DOS/360-derived operating systems rather than the high-end MVS operating system family. Nixdorf faced the same problem that Fujitsu and Hitachi did, of IBM closing their operating systems, but they solved it by buying the American software company TCSC, who maintained their own fork of the IBM mainframe DOS, called Edos, which Nixdorf then renamed NIDOS (Nixdorf DOS). TCSC had started Edos when IBM decided to make new DOS versions available only for S/370, not for older S/360 machines, hence Edos was originally a backport of those newer S/370-only DOS versions to the older S/360 machines. When Nixdorf bought TCSC, they renamed it NCSC. NIDOS ended up offering features that IBM DOS/VSE never had, like a Unix compatibility subsystem (PWS/VSE-AF, derived from Coherent) – much later, MVS (now z/OS) and VM/CMS (now z/VM) ended up getting one, but DOS/VSE (later z/VSE and now VSE^n since IBM offloaded it to 21CSW) never has.

          Siemens also once had a lower-end mainframe line, which ran an operating system optimised for smaller machines, BS1000. BS1000 was discontinued long ago, and there is little information about it online. There was a BS1000 compatibility subsystem for BS2000, called SIM-BS1000 [0], but I'd be surprised if anyone is still using it today.

          And Siemens also had BS3000 mainframes – like Nixdorf mainframes, these were fully IBM compatible, and designed to be able to run IBM's operating systems – they ran the Siemens BS3000 operating system, which was a rebadging of Fujitsu MSP – Fujitsu's stolen version of IBM MVS. Siemens had to enter into a settlement with IBM as a result, although I'm led to believe the terms were relatively lenient on Siemens, who did their best to portray themselves as innocent victims of Fujitsu's dishonesty. But that was the end of BS3000. I think the remnants of the Siemens BS3000 line ended up with Comparex too. Comparex finally shut down their IBM-compatible mainframe business in 2000; they survived as an IT services business until 2019, when they were acquired by SoftwareOne.

          And then in 1999 Siemens transferred their mainframe business to the Fujitsu-Siemens joint venture, and in 2009 Fujitsu bought out Siemens, and hence Fujitsu ended up with Siemens mainframe business.

          And so today Fujitsu has three totally incompatible mainframe lines – their own Fujitsu MSP mainframes (previously sold internationally but now only surviving in Japan), the ex-Siemens BS2000 (primarily surviving in Germany, although a little bit in the UK and a few other European countries), and the VME mainframes they got by buying ICL in 2002 (I believe the UK government is the sole remaining user, they really want to migrate off them but it is just too hard.) Both BS2000 and VME now run under x86-64, while I believe the Japanese line still has proprietary physical hardware.

          [0] https://link.springer.com/chapter/10.1007/978-3-642-67415-0_...

      • rvba 2 days ago

        It makes sense to have one standard across the world. This way good software can come from multiple countries.

        But... killing TRON probably helped a lot. Look at consoles - somehow Nintendo can come up with good software made by good, creative programmers.

        Maybe the software was killed by trade barriers... and by more trivial things - such as software developer being a poorly paid dead-end job in Japan?

        • skissane 2 days ago

          > It makes sense to have one standard across the world. This way good software can come from multiple countries.

          TRON was not the only attempt to define a standardised operating system API in the 1980s. As well as TRON and POSIX, another was IEEE Std 855-1990 (Microprocessor Operating System Interface or MOSI for short). But POSIX was the only one which really succeeded.

          MOSI is pretty obscure, but my impression of what happened there – in the early 1980s, 8-bit platforms were widely popular, but very incompatible with each other (e.g. software written for Apple II could not run on Commodore 64 even though they both had 6502 CPUs). So the proposal for a common OS API was made, and an IEEE standards committee started standardising it. But by the time the standard was finished, those 8-bit platforms were declining, and IEEE was left with a standard focused on the needs of a declining market, and so very few ever used it. [0] (MOSI itself isn't inherently 8-bit – like POSIX it is a source-level standard rather than a binary-level standard, so could be used on 16-bit or 32-bit systems – but its feature set was a lowest common denominator of what 8-bit systems supported, so not very attractive for machines that have the memory to do much more.)

          In 1988, the Japanese education ministry decided to make BTRON the standard operating system for Japanese schools. From what I understand, this move frightened Microsoft (among others), who feared that it would prevent DOS/Windows from being used in Japanese schools, or else force Microsoft to add a BTRON compatibility subsystem to their operating systems. So Microsoft lobbied the US government to pressure the Japanese government, and that pressure resulted in the Japanese education ministry dropping the requirement for BTRON, which in turn largely killed BTRON off. It didn't completely die; a variant of BTRON (Cho-Kanji) continues to be developed into this century, but it is a niche product whose primary value proposition is far more comprehensive support for obscure Kanji characters than mainstream Unicode-based operating systems (maybe useful if you do research into historical Japanese texts). Another factor in killing the Japanese education ministry's requirement for BTRON, was domestic opposition from NEC – at the time, NEC PC-98 machines running DOS were the de facto standard in the Japanese education system, and BTRON threatened NEC's dominance of that market. It could well have been a combination of both external pressure from the US government and internal pressure from NEC that killed it.

          Related is Ada Programming Support Environment (APSE) and Common APSE Interface Set (CAIS). Part of the US DOD project which resulted in Ada, whose requirements demanded not only a standard programming language, but also a standard development environment, with APIs for integrating with compilers, editors, version control, build tools, etc. CAIS is standardised in MIL STD-1838A. So it is like POSIX/MOSI/BTRON, a cross-operating system API, albeit one focused on the needs of software development rather than general purpose computing–implementations of CAIS existed for Unix, OpenVMS and MVS, so development tools written against the CAIS API could run on all three operating systems. And the US government poured untold amounts of money into it, but I'm not sure if anyone ever used it. Probably some military projects did.

          And APSE/CAIS in turn inspired PCTE (Portable Common Tool Environment), which was basically the EU's answer to APSE/CAIS. And just like APSE/CAIS, it consumed large quantities of EU research funding, before eventually being forgotten without ever seeing much if any real world use. It is standardised as ISO/IEC 13719–which apparently nobody uses, but ISO keeps on renewing because withdrawing a standard consumes bureaucratic resources, and PCTE is so obscure nobody even wants to expend the effort on withdrawing it.

          [0] There was an implementation of MOSI for CP/M-80 and Pascal-MT+ – you can find it at https://github.com/skissane/MOSI/ – but I doubt that ever saw much use.

    • cglong 2 days ago

      You can ask Gemini to summarize a YouTube video for you! Also if you have YouTube Premium on Android, you can ask questions about the current video.

      Here's Gemini's summary of GP's video: https://g.co/gemini/share/8c0417024a3f

      • hedora 2 days ago

        Alibaba just released 100 large models. One takes a 20 second video and summarizes it.

        Now I wonder if it supports audio. If so, I want the relevant browser plugin so I can read YouTube on my machine!

    • Findecanor 2 days ago

      I tend to just listen to Asianometry as a podcast while resting or doing menial tasks at home.

    • makeitdouble 2 days ago

      You do you, but I'd chime in on why it's not recommended: any simple answer to that question will just be "there's a long history and international context that led to a complex situation".

      That's the perfect TL;DW but I don't think it helps you much.

      20 min is short for such a vague question, and you can watch at 2+x the speed if info density is so paramount.

      Note that it still glosses over an incredible number of critical things; it's just not a topic that can be shortened that much for anyone who actually cares about understanding it.

      • nottorp 2 days ago

        > 20 min is short for such a vague question, and you can watch at 2+x the speed if info density is so paramount.

        Interesting how a lot of defenders just assume delivering the same info in writing is not feasible.

        Has the skill to read/write become a competitive advantage again, like in the medieval ages when 0.1% of the population knew how to do it?

        • makeitdouble 2 days ago

          To me it comes down to how the creator decided to publish their piece.

          If there is no specific accessibility need, getting it in the original format on the chosen platform would be my primary choice. In particular it's not a time-sensitive subject, and watching it later sounds easy enough.

          You seem to put reading/writing on a pedestal, but as you point out we're not in the medieval ages anymore, nobody should feel superior because they read it instead of watching it.

          • nottorp 2 days ago

            > You seem to put reading/writing on a pedestal

            No, I put my personal time on a pedestal. Videos are slow, and I'd rather spend my life in other ways.

          • Xelbair 2 days ago

            No, I just value my time and my ability to search information.

            Plus, I'm unable to consume media in the 'background'.

    • ranger_danger 2 days ago

      there are many such sites if you just google for them