Most used programming languages...
Steffen Huber (91) 1953 posts |
I haven’t followed the further development of OS/2 and its successors eComStation and ArcaOS in detail after IBM canned it (and thus made it pretty unlikely that anyone not locked into legacy stuff would continue using it), but my impression was that the last credible Java version was a community port of OpenJDK 6. The last production-ready version was IBM Java 1.4.2. But I am sure you can provide a link where I can get the latest Java 18 Early Access version for OS/2. Not that it would run sensibly on my OS/2 Warp 486 PC I still have somewhere… |
Steffen Huber (91) 1953 posts |
This sentence makes no sense. The JVM is the bytecode engine. The different JVM implementations range from very simple to insanely complex. I guess you meant “bytecode representation” or “bytecode specification” instead?
But in practice, Hotspot and OpenJ9 are very fast, rivalling the speed of native code. And if you really want/need native code – sacrificing a bit of flexibility – the GraalVM Native Image feature covers that nicely. Not mature on all platforms yet, but gets better with every release.
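For anyone who hasn’t tried it, the workflow is roughly this – a minimal sketch assuming a GraalVM installation with the native-image tool available; the class name is just a placeholder:

// HelloNative.java - a trivial program to feed to GraalVM Native Image.
// With GraalVM installed, the usual sequence is roughly:
//   javac HelloNative.java
//   native-image HelloNative
// which compiles ahead of time and produces a standalone executable
// instead of bytecode that needs a JVM at run time.
public class HelloNative {
    public static void main(String[] args) {
        System.out.println("Hello from an ahead-of-time compiled binary");
    }
}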
This tendency is there for all languages apart from those that make it very hard to write reasonably complex software. It mostly depends on the developers, not on the language used.
Compared to what? I very rarely see good software. Independent of implementation language.
Completely untrue. Up until the release of OpenJDK and the organization of further development via the JCP, Java was not really open. Nowadays, it is. “Nowadays” as in “for the last 10 years”.
Mostly based on misinformation and/or outdated information it seems. |
mark stephens (181) 125 posts |
There is currently a new Java release every 6 months (with a long-term support release every few years). A lot of the new features are there to help developers write more readable code (i.e. improvements to switch statements and functional programming).
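To give a flavour of that, here is a minimal sketch of one of those features – switch expressions, standard since Java 14 – with class and enum names made up purely for illustration:

// A switch expression: arrow labels, no fall-through, no break statements.
public class SwitchDemo {
    enum Day { MON, TUE, WED, THU, FRI, SAT, SUN }

    static int workingHours(Day day) {
        return switch (day) {
            case SAT, SUN           -> 0;   // several labels per case
            case FRI                -> 6;
            case MON, TUE, WED, THU -> 8;
        };  // the compiler checks that every enum constant is covered
    }

    public static void main(String[] args) {
        System.out.println(workingHours(Day.FRI));   // prints 6
    }
}

Compared with the old statement form, there is nothing to forget to break out of, and the expression either covers every case or fails to compile. |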
jim lesurf (2082) 1438 posts |
I gave up on Java decades ago because they kept on introducing ‘new’ methods, etc, faster than I could learn them! And the ones I’d learned were invariably then ‘deprecated’. |
Steffen Huber (91) 1953 posts |
I am especially interested in your ongoing research wrt the claim that OS/2 has an up-to-date Java implementation available. |
Steffen Huber (91) 1953 posts |
This is strange, because you can still compile and execute (the vast majority of) your Java 1.1 code – i.e. from more than 20 years ago – on the very latest Java version. There have been very few “breaking changes” throughout Java’s history: mostly code that accessed JDK-internal classes (hey, your IDE warned you about that!) and the occasional new keyword.

A few things have been removed from the runtime, but you can usually just bundle them with your application yourself (JavaFX, JAXB, Nashorn…) instead of relying on the JRE.

Of course you have to keep in mind that just because something is deprecated in Java does not mean it will go away. It only gets interesting if it is “deprecated for removal”, as was the case for Applets, where the API will finally be removed with Java 18ff.

When it comes to backwards compatibility, few languages care as much as Java, while still modernizing the language and the runtime at an astonishing pace. Overall, the Java universe is a pretty backward-compatible thing, and things don’t get changed without proper investigation. With the possible exception of changing the default encoding for property files, of course…
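To make the distinction concrete, here is a minimal sketch – the class and method names are invented, not anything from the real JDK:

// LegacyApi.java - the two flavours of deprecation in the Java SE platform.
public class LegacyApi {

    // Plain deprecation: still works and is not scheduled to disappear;
    // callers merely get a warning.
    @Deprecated
    public static void oldButHarmless() {
        System.out.println("deprecated, but staying around");
    }

    // The interesting case: explicitly flagged for removal in a later release.
    @Deprecated(since = "17", forRemoval = true)
    public static void onBorrowedTime() {
        System.out.println("deprecated for removal - migrate away from this");
    }
}

// A separate class, so javac actually reports the deprecated uses.
class Caller {
    public static void main(String[] args) {
        LegacyApi.oldButHarmless();  // detailed warning with -Xlint:deprecation
        LegacyApi.onBorrowedTime();  // "deprecated and marked for removal" warning
    }
}

If I remember correctly, javac also treats the two differently: ordinary deprecation warnings need -Xlint:deprecation for the details, while the “marked for removal” warnings are emitted by default.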
So when DavidS writes… …it looks like the same spreading of misinformation as ever. |
Rick Murray (539) 13840 posts |
Well, it’s a better attitude than PHP’s, and let’s just not talk about the Python 2 and Python 3 debacle. |
David Feugey (2125) 2709 posts |
It can be deprecated AND supported. |
Clive Semmens (2335) 3276 posts |
The point of DEPRECATED is that it IS supported – for the present – but probably won’t be after some future release, or may not be on some versions of the architecture that are otherwise compatible. (That was certainly the meaning when I was writing the docs, anyway.) |
Steve Pampling (1551) 8170 posts |
I’d always understood it to mean “still present and working, but if you find a bug: well, that’s another reason we deprecated it – aren’t you glad the buggy bit is going?” Sort of similar to “Google fully supports this – this week”. |
Clive Semmens (2335) 3276 posts |
That’s very often the unwritten subtext to the official story I regurgitated, certainly. |
Rick Murray (539) 13840 posts |
Funny. My impression of “Deprecated” is the complete opposite. It is not supported, but it is still present. Hence anything that uses something deprecated won’t suddenly blow up, fail, or dump endless errors (PHP, I’m looking at you).

Generally speaking, things can be deprecated if something new and better comes along, but they should only be removed if there’s an actual security risk. If the language just decides to reorganise things or change function names in order to improve something, then there should be some sort of aliasing between the older style and the newer style. You know, like how you can still use K&R function definitions in C. Because all the existing code doesn’t rewrite itself.

And then there’s Python. Which managed to make a bigger balls up of changing stuff than .Net, and that’s saying something. Perhaps a tacit admission that the original Python was badly designed? I don’t know, but it doesn’t give me too much confidence for Python 5.
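Coming back to the aliasing point: something like this (with names invented purely for the sake of example) is all it really needs to be –

// Renaming an API without breaking callers: keep the old name as a thin,
// deprecated alias that simply forwards to the new one.
public class TextUtils {

    // New, preferred spelling.
    public static String joinWords(String separator, String... words) {
        return String.join(separator, words);
    }

    // Old spelling, kept as an alias so existing code keeps compiling.
    @Deprecated
    public static String concatWords(String separator, String... words) {
        return joinWords(separator, words);
    }

    public static void main(String[] args) {
        // The old spelling still compiles and runs; callers in other
        // classes would additionally see a deprecation warning.
        System.out.println(concatWords("-", "still", "compiles"));
    }
}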
Yes, processors are permitted to have a stricter definition of deprecated (“this instruction is no longer officially supported and will be removed soon”) because there are architectural reasons why they are being removed (like SWP). That said, some architectures toss out old unwanted stuff, others don’t. Your shiny liquid nitrogen cooled multicore x86-64 slab still boots up in 16 bit Real Mode and you can (sort of) boot up MS-DOS and the like if you wanted. Native, not under any emulation. The problem is that such old systems don’t understand modern hardware (EFI boot, USB keyboards, SATA, NTFS on ridiculously huge drives, etc etc) and if there’s no BIOS compatibility built in (Intel are starting to remove this) then you would need to develop your own drivers – and good luck addressing stuff in a 16 bit world. ;-)
Given Google’s past history, that can count for pretty much anything that isn’t Search, YouTube, GMail, or ad-flinging. https://killedbygoogle.com/ |
Clive Semmens (2335) 3276 posts |
Oh, I spoke only of how ARM defined DEPRECATED up to at least 2007 – and only in the context of the instruction set architecture at that. I don’t recall using it in any other context. Whether other organizations defined it the same way, or at all, I couldn’t say. I wouldn’t even be confident ARM still define it exactly that way; things at ARM have changed a great deal since I left – not, I hasten to add, connected in any way with my retirement!
Oh, perhaps in software that’s fair enough. In hardware & computer architecture, there are other valid reasons for removing something – things I was conscious of were reusing instruction set space (think the Never condition in early ARM architectures, but there are other (smaller) examples), power saving, chip real estate saving, and streamlining instruction decode. Keeping the odd bit of old baggage is one thing, but burdening yourself with loads of it (Intel style) is another… |
Rick Murray (539) 13840 posts |
Could explain why the ARM is itty bitty and x86-64 devices are, well… Those heatsinks look like something out of Starfighter 3000. |
Steffen Huber (91) 1953 posts |
Your posts would be a lot more credible if you actually started to substantiate the various extraordinary claims you are so happy to throw around. Maybe others could then start to see where your own personal “truth” comes from.
OK, now please look up what “deprecated” actually means in the Java world (JDK, JRE, i.e. the SE platform – obviously not any old library that might break compatibility at any time) instead of fantasizing about what you THINK the meaning is. Hint: it is completely different from “deprecated for removal”. Even better would be if you came up with an actual example where Java changed in a way that “obsoleted your code”. Going by your comment, there must be a whole lot of examples of that, so it should be very easy. Oh, and don’t forget to answer the OS/2 question. |
Clive Semmens (2335) 3276 posts |
That would be all very well if terms had fixed definitions, but English isn’t generally like that. Thus when complete clarity is required you have to define the terms you use. ARM used to be (and may still be, I’ve not checked) very good about that. “Unsupported”, like “Deprecated”, does not have a very precise meaning in English. You need to check exactly what it means in the document you read it in. I don’t recall using “unsupported” in the ARM ARM, or in the ARM Assembler manuals, so I suspect it’s not a term ARM uses (or used in those days), but it’s possible it’s just a hole in my memory. If we did use it, there’ll be a definition of it in the books. |
Rick Murray (539) 13840 posts | “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.” |
Kuemmel (439) 384 posts |
I’ll cite the definition from the glossary of the ARMv8 instruction set document:

Deprecated – Something that is present in the Arm architecture for backwards compatibility. Whenever possible software must avoid using deprecated features. Features that are deprecated but are not optional are present in current implementations of the Arm architecture, but might not be present, or might be deprecated and OPTIONAL, in future versions of the Arm architecture.

You can even find the definition of OPTIONAL there, which makes it a bit more complicated – or clearer ;-) I guess that definition might also work for programming languages… But within the document itself it sometimes says just “deprecated” and sometimes “deprecated due to performance reasons”… so still a bit unclear, and it really seems to depend on context… |
Graeme (8815) 106 posts |
Optional in the ARM definition is due to features of the ARM chip being optional to implement by various manufacturers. A chip can be made with or without NEON for example. Deprecated simply means ‘will be made obsolete without further warning but not yet’. If you program in a language, you should not use deprecated things. The next time you update your compiler, it may not compile because they have moved from deprecated to obsolete. If you write a compiler or interpreter, you must support deprecated items. This way, there is an overlap. A warning for programmers that some things are going to go away whilst they are still working. |
Clive Semmens (2335) 3276 posts |
I think that’s a more recent definition than 2007, isn’t it?* I don’t recall us actually defining OPTIONAL back in my day…although Neon was optional from its beginning, so perhaps I’m just misremembering.
Absolutely so. Lewis Carroll understood the nature of the English language extremely well, and was a real nerd of his time. * Edit: well, it obviously is, since it’s the ARMv8 document. But I think it’s also a bit more detailed than the 2007 ARM ARM’s was. |
David J. Ruck (33) 1635 posts |
Rick wrote (while I was in Cornwall with no internet):
Yes, it was an admission that Python 2 was badly designed, particularly in its handling of Unicode. So they had a choice between letting the language slowly die through creeping irrelevance, or taking the pain of making some incompatible changes to fix it in Python 3. As there is no Python 4 yet, it’s a bit too soon to make assumptions about Python 5.

You could say exactly the same thing about ARM and the 26 bit to 32 bit mode breaking changes. If they had stuck with 26 bit processors restricted to 64MB of program memory, they would be no more than a niche low power microcontroller supplier today. Python 4 will probably be backwards compatible with 3 to a greater degree than 3 was with 2, but it may not be. If ARM hadn’t developed the non-backwards-compatible 64 bit ARMv8 ISA, they wouldn’t be supplanting Intel in data centres, and would probably only be in the budget end of mobile phones now.

With both Python and ARM, there comes a point where absolute backwards compatibility becomes less important than changing to meet new challenges, while maintaining the same design philosophy and tooling. The alternative is to be replaced by something completely different, which requires even more relearning and porting. |
Rick Murray (539) 13840 posts |
Only in part?
I’m going to disagree. Surprise! 😋

In the late ’70s, you needed a degree (probably in electronics) in order to use a small “home” computer. They usually came in kit form, had to be assembled (and debugged), and some of the friendlier ones provided a monitor that allowed you to enter a program by bashing in hex codes. We won’t even talk about the price. Accordingly, they weren’t very popular.

In the eighties, home computers got a lot friendlier. A little box that plugged into a TV, cassette tapes, and some form of (usually pretty lousy) BASIC. These machines were often quite simple both electronically and in terms of firmware, so one person could understand the operation of the entire machine. Prices were a lot less scary, and in the UK the push via The Computer Literacy Project helped introduce a lot of children to computers. They were very popular as they had potential, and it was a pretty even playing field – a smart teenager could put together a game every bit as good as the commercial releases.

In the late 80s and 90s, machines sort of lost their way a little. More capable, more powerful, but too often viewed as “things for games”. Around that time Amstrad made some machines that were less a computer and more a “business machine” – they would start up straight into a word processor or spreadsheet… They weren’t terribly good, but Amstrad did figure out that what was putting people off was that computers were often “too complicated”. I managed to program my mother into understanding that A: was the little slot on the front, C: was something inside, and D: was the CD-ROM.

It wasn’t until the early-to-mid 90s (with the introduction of Windows 3.1) that computers started to get more popular. It’s pretty easy to malign Windows for a billion technical reasons, but psychologically it made things a lot simpler. The machine could be set to start up Windows automatically, so every subsequent interaction could be pushing the arrow to little pictures and prodding a button to activate that picture. Applications such as Word weren’t that complex at the time, but coupled with a laser printer (increasingly less expensive) they could output professional-looking documents that were a million times better than the dot matrix efforts of the previous decade.

Therefore, it is extremely clear that the popularity of a machine (and its operating system) has nothing to do with how many people learn to program it. Indeed, when considering the world at large, most people don’t give a flying figlet about programming. They really don’t. Their entire justification, from about the days of DOS 3.3 onwards, is “can this thing do what I want?”. In the late 70s, maybe one computer per town. Fast forward to today: an average modern household may have at least a laptop, maybe a tablet or two, and pretty much everybody has a smartphone (y’all can argue amongst yourselves about Android or iOS, because everything else got left behind).

In terms of commerce, a successful machine is defined simply as “one that sells”. Cheap to make, easy to sell, makes a profit. How many Beebs were sold? Or the Apple II? Or the Atari? Now pick a smartphone manufacturer and a reasonably decent (but not flagship) model. I rest my case. |
Rick Murray (539) 13840 posts |
Here’s a hint – when you’re talking with Brits, expect some degree of sarcasm. It’s just how we are wired up.
It’s how I interpreted your talking about how “well received” a system is. What is “well received” if not a synonym for popular? Plus – while there are some exceptions – operating systems, computers, even entire companies have lived and died depending on their “popularity”. The world’s best system that nobody uses is indistinguishable from a polished turd that nobody uses. What they have in common is that they both, essentially, failed.
True, but I think the commonly accepted definition is “coin comes, not goes”. You might not subscribe to the commerce definition; however, a computer that is made in order to be sold is pretty much exactly what commerce is, so no other definition makes sense. A company making a profit is generally considered a good thing if it allows for the development of newer, better things.
ßetamax vs VHS, just to demonstrate “it’s the same old story” (everywhere I go, I get slandered, libelled, I hear words I never heard in the Bible….). |
Steve Pampling (1551) 8170 posts |
If you add in the V2000 standard then I think you get a nice progression: quality going one way, and availability of content – and hence sales quantities – going the other.
Something about that made me think of Trump. :) |
Stuart Swales (8827) 1357 posts |
:-) |