A large motivation for this move is likely to ensure that attempts by some incumbent ISA vendors to lobby the US government to curb the uptake of RISC-V are stymied.
There appears to be an undercurrent of this sort underway: the soaring popularity of RISC-V in markets such as China makes it politically ripe for some incumbent ISA vendors to turn US government opinion against RISC-V, whether to slow general uptake or to introduce laborious procedural delays in that uptake.
Turning the ISA into an ISO standard helps curb such attempts.
Ethernet, although not directly relevant, is a similar example. You can't lobby the US government to outright ban or generally slow the adoption of Ethernet, because it is such a universal phenomenon by virtue of being a standard.
Then, there's NASA, and their rad-hard HPSC RISC-V. It's a product now, with a Microchip part number (PIC64-HPSC1000-RH) and a second source (SiFive, apparently). I suppose it's conceivable that a Berkeley, CA-developed ISA that has been officially adopted as the new rad-hard avionics CPU platform by the US government's primary aerospace arm could get voted off the island in some timeline, but it's looking fairly improbable at this point.
But yeah, the ISO standard doesn't hurt.
Only time will tell if it ends like: "to avoid someone else shooting us, let's shoot ourselves".
Dedicated consortiums like CNCF, the USB Implementers Forum, the Alliance for Open Media, the IETF, etc. are more qualified at moving a standard forward than ISO or government bodies.
> There appears to be an undercurrent of this sort underway: the soaring popularity of RISC-V in markets such as China makes it politically ripe for some incumbent ISA vendors to turn US government opinion against RISC-V, whether to slow general uptake or to introduce laborious procedural delays in that uptake.
> Turning the ISA into an ISO standard helps curb such attempts.
Why do you think that would help? I fail to see how that would help.
I wish they'd write a test suite or certification program instead. Those ISO standard documents are nowadays easier to parse with a chatbot, but they are still the wrong language for the job.
Test suites: https://github.com/riscv-software-src/riscv-tests
Formal model: https://github.com/riscv/sail-riscv
> The RISC-V ISA is already an industry standard and the next step is impartial recognition from a trusted international organization.
I'm confused. Isn't RISC-V International itself a trusted international organization? It's hard to see how an organization that standardizes screws and plugs could possibly be qualified to develop ISAs.
ISO defines standards for much more than bolts and plugs. A few examples: the C++ ISO standard, IT security standards, and workplace safety standards, and that’s a small subset of what they do.
They develop a well-defined standard, not the technologies mentioned in the standard. So yes, they’re qualified.
But isn't RISC-V just a standard? ISO will decide what is RISC-V and what isn't. Then its complicated process will become an obstacle to innovation.
C++ "standard" sounds more like an example of why technology should avoid standards
It is certainly an example of why SC22 is a bad idea
The "C++ Standards Committee" is Working Group #21 of Sub Committee #22, of the Joint Technical Committee #1 between ISO and the IEC.
It is completely the wrong shape of organization for this work: a large, unwieldy bureaucracy created so that sovereign entities could somehow agree on things. That works pretty well for ISO 216 (the A-series paper sizes), and while it isn't very productive for something like ISO 26262 (safety), it can't do much harm there. For the deeply technical work of a programming language it's hopeless.
The IETF shows a much better way to develop standards for technology.
The Titanic is not an example of why building ships has to be avoided. C++ is a great example, yes, of the damage ambitious and egotistical personas can inflict when cooperation is necessary.
Say what you will about C++, but it is undoubtedly one of the most successful and influential programming languages in history.
By which metric?
C, Java, Rust, JS, C# do exist
If we are taking cheap potshots, there's a standard for standards: https://xkcd.com/927/ or in the proposed XKCD URI form xkcd://927
> It's hard to see how an organization that standardizes screws and plugs could possibly be qualified to develop ISAs.
You, my friend, have not delved into the rabbit hole that is standardisation organisations.
ISO and IEC go so far beyond bolts and screws that it's frankly dizzying how far-reaching their fingers are in our society.
As for why, the top comment explained it well: there is a movement to block RISC-V adoption in the US for some geopolitical shenanigans. Standardisation with a trusted authority may help.
Not sure if this is a good idea given how ISO has been going for programming languages.
Yeah. I think the ISO process would likely slow down the development of the ISA.
Not only that, it might turn RISC-V from a specification freely available under a FOSS license into a proprietary standard that you have to pay 285 CHF (~$350) to buy a non-transferable license for.
ah, yes. OPEN like science or AI
What's the advantage of standardizing through ISO/IEC? Better adoption in industry?
Seems like this would take away a lot of power from RISC-V International. But I don't know much about this process.
As the article says:
> “International standards have a special status,” says Phil Wennblom, Chair of ISO/IEC JTC 1. “Even though RISC-V is already globally recognized, once something becomes an ISO/IEC standard, it’s even more widely accepted. Countries around the world place strong emphasis on international standards as the basis for their national standards. It’s a significant tailwind when it comes to market access.”
He says that, but I don't agree. If anything, RISC-V would have been less successful at being picked up in discount markets if the specs weren't free to download, and I don't know what fringes they're trying to break into, but probably none of them care whether the spec is ISO.
That can depend on how the spec gets made into an ISO standard. There is a process called "harvesting" that can allow the original author to continue to distribute an existing specification independently of ISO.
Usual lies. There are a plethora of largely ignored international standards. Making something an international standard is just one of many ways to try to achieve wide worldwide acceptance, and it still has a high failure rate.
Government agencies like to take standards off the shelf whenever they can. Citing something overseen by an apolitical, non-profit organization avoids conflicts of interest (relative to the alternatives).
Random example I found at a glance: NIST recommending use of a specific ISO standard in domains not formally covered by a regulatory body: https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.S...
It's impossible to take ISO seriously after the .docx fiasco.
That’s the definition of throwing the baby out with the bath water.
Is ISO as an organisation imperfect sometimes (as in the .docx case)? Sure, it's composed of humans, who are generally flawed creatures. Is it generally a good solution despite that? Also sure.
They've published tens of thousands of standards over 70-plus years that are deeply important to multiple industries, so disregarding them because Microsoft co-opted them once, 20-odd years ago, seems unreasonable to me.
What .docx fiasco?
Office Open XML, the standard behind .docx and the other zipped XML formats, was fast-tracked into an international standard without many rounds of review (by the same JTC 1!).
My take is that it could help tie up fragmentation. RISC-V has different profiles defining which instructions are included for different use cases, like a general-purpose OS, and enshrining them as an ISO standard would give the entire industry a rallying point.
Without these profiles, we are stuck memorizing what a word soup like RV64GCBV_Zicntr_Zihpm_etc all means.
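As a rough illustration of what a profile buys you in practice (my reading, not the spec; the exact -march spellings are from memory, so treat them as assumptions):

    /* Sketch: targeting a profile vs. an ISA string.
     *
     * ISA-string style build flag (hypothetical example):
     *   -march=rv64gc_zba_zbb_zbs_zicntr_zihpm
     * Profile style (recent GCC/Clang accept profile names, I believe):
     *   -march=rva22u64
     *
     * Portable source code usually doesn't parse either form; it checks
     * the __riscv_* feature-test macros defined per the RISC-V C API spec.
     */
    #include <stdio.h>

    int main(void) {
    #if defined(__riscv) && defined(__riscv_v)
        puts("built with the V vector extension enabled");
    #else
        puts("no vector extension (or not a RISC-V target)");
    #endif
        return 0;
    }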
RISC-V was already gaining a profile mechanism outside of ISO; for example, 'RVA23' is a known set of extensions.
Hardly, see programming languages standards and compiler specific extensions.
languages are more fluid than processor architectures. I don't think they can be compared.
One would think, yet welcome to enterprise consulting, especially with customers whose main business is not selling software.
You will find fossilized languages all over the place.
Fossilised is often desirable or requested in some industries. Developing for the embedded market myself, we often have to stick to C99 to ensure compatibility with whatever ancient compiler a customer or even chipset vendor may still be running.
RISC-V never had a fragmentation problem, thanks to the profiles.
I wouldn't say it never had a problem, but the profiles are definitely a reasonable solution.
However even with profiles there are optional extensions and a lot of undefined behaviour (sometimes deliberately, sometimes because the spec is just not especially well written).
The FUD keeps being brought up, but the solution here was in place before the potential issue could manifest.
It started with G, later retroactively named RVA20 (with a minor extra extension that nobody ever skipped implementing), then RVA22 and now RVA23. All application processor implementations out there conform to a profile, and so do the relevant Linux distributions.
Of course, in embedded systems where the vendor controls the full stack, the freedom to micromanage which extensions to implement, as well as the freedom to add custom extensions, has actual value.
The original architects of the ISA knew what they were doing.
Maybe it helps get government contracts
“We’re standards compliant”
It's not like ARM and x86 are standardised by ISO either.
Governments seem to care about "self-sufficiency" a lot more these days, especially after what's happening in both China and the US right now.
If the choice is between an architecture owned, patented and managed by a single company domiciled in a foreign country, versus one which is an international standard and has multiple competing vendors, the latter suddenly seems a lot more attractive.
Price and performance don't matter that much. Governments are a lot less price-sensitive than consumers (and even businesses), they're willing to spend money to achieve their goals.
This is exactly what makes this such an interesting development. Standardization is part of the process of the CPU industry becoming a mature industry not dependent on the whims of individual companies. Boring, yes, but also stable.
Yes, and they're both massively debated and criticised, to the point that the industry developed RISC-V in the first place. Not to mention the licensing rug-pull ARM pulled a few years back.
Yes, but if 30 years ago ARM had an ISO standard they could point to, that would have probably helped with government adoption?
(It's still a trade-off, because standards also cost community time and effort.)
Relatedly, 30 years ago someone attempted to turn the Windows 3.1 API into an ISO standard:
https://en.wikipedia.org/wiki/Application_Programming_Interf...
It didn't become one, but it did become standardised as ECMA-234:
https://ecma-international.org/publications-and-standards/st...
Well, Wine shows that Win32 is the only stable ABI, even on Linux.
>On May 5, 1993, Sun Microsystems announced Windows Application Binary Interface (WABI), a product to run Windows software on Unix, and the Public Windows Interface (PWI) initiative, an effort to standardize a subset of the popular 16-bit Windows APIs.
>In February 1994, the PWI Specification Committee sent a draft specification to X/Open—who rejected it in March, after being threatened by Microsoft's assertion of intellectual property rights (IPR) over the Windows APIs
Looks like that's what it was.
they are de-facto…
It ticks a checkbox. That's it. Some organizations and/or governments might have rules that emphasize using international standards, and this might help with it.
I just hope it's going to be a "throw it over the fence and standardize" type of a deal, where the actual standardization process will still be outside of ISO (the ISO process is not very good - not my words, just ask the members of the C++ committee) and the text of the standard will be freely licensed and available to everyone (ISO paywalls its standards).
> the ISO process is not very good - not my words, just ask the members of the C++ committee
Casual reminder that they ousted one of the founders of MPEG for daring to question the patent mess around H.265 (paraphrasing, a lot, of course)
This allows RISC-V international to propose their standards as ISO/IEC standards.
It would be very cool to run the compiled code developed in an ISO/IEC-standardized language on an ISO/IEC-standardized CPU. It might even be standard-compliant.
They're excited about putting the spec behind a notoriously closed paywall??
Us older nerds will remember how Microsoft corrupted the entire ISO standardization process to ram Office Open XML (.docx/.xlsx/etc.) down the world's throat.
The original OOXML ISO standard was 6000+ pages and basically declared unreproducible outside of Microsoft themselves.
There is an entire Wikipedia article dedicated to the kafkaesque byzantine nightmare that was that standardization. [0]
ISO def lacks luster, and maybe even relevance.
[0] https://en.wikipedia.org/wiki/Standardization_of_Office_Open...
I don't understand why they want to put the RISC-V spec behind the ISO paywall. It will just complicate the access to the standardized version to confirm compliance with it.
Why ISO? Why not somewhere that will allow people to read the standard for free?
Are there any promising core designs yet? Multi-core designs? Any promising extensions being standardized?
I really want to believe, but I don't think we'll see anything like an M5 chip anytime soon simply because there's so little investment from the bigger players.
Yeah Rivos apparently taped out a high performance server class core (probably only a test chip I'd guess) before Meta bought them.
There are plenty of multi core designs (that's easy) but they aren't very fast.
In terms of open source XiangShan is the most advanced as far as I know. It's fairly high performance out-of-order.
I don't think there's anything M5-level and probably won't be for a while (it took ARM decades so it's not a failing). I doubt we'll see any serious RISC-V laptops because there probably isn't demand (maybe Chromebooks though?). More likely to see phones and servers because Android is supporting RISC-V, and servers run Linux.
In terms of extensions I think it's pretty much all there. Probably it needs some kind of extension to make x86 emulation fast, like Apple did. The biggest extension I know of that isn't ratified is the P packed SIMD one but I don't know if there's much demand for that outside of DSPs.
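For what it's worth, the ratified Ztso (total store ordering) extension is aimed at exactly this kind of thing. A minimal sketch of why the memory model is the expensive part of naive x86 emulation, using plain C11 atomics (illustrative only, not any emulator's actual code):

    /* x86 code assumes TSO: every store has release semantics and every
     * load has acquire semantics. An emulator translating x86 must keep
     * those guarantees on the host. */
    #include <stdatomic.h>

    atomic_int flag;
    int data;

    void producer(void) {
        data = 42;
        /* Under RISC-V's default weak model (RVWMO) a release store is
         * typically an explicit fence plus the store; under Ztso (or on
         * Apple's TSO mode) a plain store already provides this ordering. */
        atomic_store_explicit(&flag, 1, memory_order_release);
    }

    int consumer(void) {
        /* Likewise, x86 loads come with acquire ordering "for free";
         * emulating that per-load without TSO hardware costs fences. */
        while (atomic_load_explicit(&flag, memory_order_acquire) == 0)
            ;
        return data;
    }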
Tenstorrent has announced Ascalon development boards, TBA 2026 Q2.
That's not gonna beat the M5, but it should be similar or better relative to M1, and a huge performance jump for RISC-V.
I wonder why. Marketing? An ISO tax mandatory to access some specific markets? That said, they should be careful about what they will pay in order to get an ISO stamp, and about which parts of RISC-V will be covered... because RVA will probably see significant changes (after a while it may drop some hardware requirements which are mostly there to help port from legacy ISAs to RISC-V). Not to mention, it seems there are doubts about the core memory reservation mechanism (LR/SC) versus Zacas, and only designers of large, performant RISC-V implementations could answer that; maybe this is a fluke.
It weirdly feels too early.
ISO is often the source of feature creep in programming languages, or of massive bloat (mechanically favoring some vendors) in file formats. In other words, everything from ISO must be looked at in detail to see if it is 'clean'.
busywork ... but maybe good marketing - people somehow believe that ISO has some relationship to quality.
People with absolutely no technical clue who only know "ISO 9001" equate "ISO" with quality initiatives and certifications.
What people with a better clue sometimes wrongly equate ISO with is interoperability.
ISO standards can help somewhat. If you have ISO RISC-V, then you can analyze a piece of code and know: is this strictly ISO RISC-V code, or is it using vendor extensions?
If an architecture is controlled by a vendor, or a consortium, we still know analogous things: like does the program conform to some version of the ISA document from the vendor/consortium.
That vendor has a lot of power to take it in new directions though without getting anyone else to sign off.
> is this strictly ISO RISC-V code, or is it using vendor extensions
I doubt it - the ISO standard will still allow custom extensions.
A standard 64-bit + DSP RISC-V would go a long way toward undoing the fragmentation damage caused by the "design by committee" approach.
It was the same mistake that made ARMv6 worse/more complex than modern ARMv7/8/9. =3
As if we have never seen design-by-committee damage coming from ISO?
Have you heard of this C++ thing? :)
> Have you heard of this C++ thing?
The STL was good, but Boost proved a phenomenon...
https://en.wikipedia.org/wiki/Second-system_effect
ISO standards are often just a sign that process people are in control =3
Good marketing, this could open up more large investment into RISC-V.
Be honest, what does RISC-V offer that 10 year old AArch64 doesn't already provide?
RISC-V is still too green, and fragmented standards always look like a clown car of liabilities to business people. =3
While the sentiment is a bit harsh, the performance gap noted is real. RISC-V has a ways to go to catch up to ARM64 and then finally AMD64, but if the Apple M1 taught us anything, it's possible.
What does <open source anything> offer that trusty old <proprietary burden> doesn't already provide?
Less legal risk: ARM has grown litigious and wants a bigger piece of the pie.
IP costs real money, and consumers usually don't care how people split up their pies.
100% of a small pie is worth far less than a slice from a large pie. I've met people that made that logical error, and it usually doesn't end well. =3
RISC-V has always been an ivory tower, with a lot of bad decisions they double down on. Not surprised they're rushing towards this outdated stamp of authority too.
>bad decisions they double down on.
Could you elaborate?
No overflow/carry flag, which hurts safe overflow checking and bignum performance; the whole conditional-move history, the backpedaling, and the current state of Zicond; a system for describing feature support that is needlessly complicated and just a mess for users outside of embedded; a spec written more like an academic paper than a CPU manual; vector instructions that act like they're written for a coprocessor for some reason; bad frame pointer ABI support; etc.
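To make the first point concrete, here is a minimal sketch of the kind of check in question, using a standard GCC/Clang builtin (the instruction sequence in the comment is indicative, from memory):

    #include <stdint.h>
    #include <stdbool.h>

    /* Checked unsigned addition. On a flags-based ISA the carry is a side
     * effect of the add; without a carry flag, RISC-V typically lowers
     * this to something like:
     *   add  t0, a0, a1      # sum
     *   sltu t1, t0, a0      # carry out: sum < a (unsigned)
     * i.e. an extra comparison (and more for signed overflow or bignums). */
    bool checked_add(uint64_t a, uint64_t b, uint64_t *out) {
        return __builtin_add_overflow(a, b, out);
    }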