I almost never wait in huge lines for anything. I camped out once for football tickets in college. Once. I also once waited six hours for an iPhone 4 when it first came out. It was my first smartphone and I had been putting off getting one for way too long. That was it though. Yet I know people who have waited in ever-decreasing lines for each iteration of the iPhone. The shrinking lines are definitely part of the sizzle wearing off and the iPhone becoming just another smartphone. Yet even at 8 pm last night there was a line for iPhones outside our local Apple store. It didn’t wrap around the mall like in the iPhone 4 days, but the fact that there was still a line for an iPhone 8 at the end of the first day was pretty telling to me.
It was just a few months ago that Ubuntu announced they were killing off Unity, their main desktop option. Many people wondered if this was part of their larger pivot towards more profitable ventures and thus they would be leaving the desktop behind. I too was filled with worry about that potential outcome but calmed myself by remembering that I was no longer locked into one vendor for my OS. In the intervening months, however, it has become clear that Ubuntu is not killing off the desktop, far from it. In fact, the strides they are taking with Ubuntu 17.10 and Ubuntu 18.04 look like they are about to produce the strongest desktop offering to date. Not having to carry the weight of a phone platform, their own desktop environment, etc. has allowed their team to focus on contributing upstream to GNOME proper. I’ve had the opportunity to play around with the Ubuntu 17.10 betas and have to say that I don’t think I’d be missing anything from my current Ubuntu experience. I look forward to upgrading to 18.04 when the time comes, and to no longer worrying about whether one of my desktop baselines is going away.
On one of my classic computing Facebook Groups there was a post quoting Edsger Dijkstra: “It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.” It’s actually part of a much larger document where he condemns pretty much every higher-order language of the day, IBM, the cliquish nature of the computing industry, and so on. Despite most of it being the equivalent of a Twitter rant, in fact each line is almost made for tweet-sized bites, there are some legitimate gems in there; one relevant to this topic being, “The tools we use have a profound (and devious!) influence on our thinking habits, and, therefore, on our thinking abilities.” No, I don’t agree with the notion that starting with BASIC, or any other language, permanently breaks someone forever, but if the tools we use drive our thinking, then a language can saddle us with bad habits we later have to unlearn. Yet has anyone actually tried to write BASIC, as in the BASIC languages of the 60s, 70s, and early 80s, with actual design principles? Fortunately/unfortunately, I tried a while ago, with some interesting results.
While I’m obviously becoming quite enamored with Kotlin recently, this is like the early dating stage for me. Everything is great when you first start dating someone, but it’s after you’ve been with them for a while and seen their warts, which everything and everyone has, that you finally decide whether it’s the right fit or not. While I’m very excited about being able to use a modern language in the JVM ecosystem, having my cake and eating it too, I’m still not fully on the bandwagon in terms of recommending it for everyone. My homework on this is still underway, or actually still in its infancy. A quick list of the things I still need to weigh:
- How well is it supported in the various IDEs and on the various platforms? I’ve been doing this in IntelliJ, the IDE from the vendor that created the language, but what about Eclipse, or NetBeans, or VSCode? Is it feasible, much less reasonable, to use this language for real in those environments compared to, say, Java or .NET Core?
- How well does this perform compared to other languages? I’ve started exploring the whole benchmarking thing already but there is still a long way to go. I have many areas I want to explore beyond the raw performance of the base language compared to Java, .NET, and other languages, including questions like: How do its streaming capabilities compare to Java Streams, .NET LINQ, etc.? How well does it work for numerically intensive applications, given that primitive types have to be boxed in generic and nullable contexts?
- What is the tooling like for doing code coverage, code instrumentation, etc?
- How good is the documentation, tutorials, examples, etc. for new users and for weird corner cases with the language?
- How well can it be used for integrating with Java libraries like JavaFX, Spring, and other necessary components for building applications? My initial interactions are very positive but I need to work through more real-world use cases rather than rely on the smaller experiments I’ve already tried.
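The boxing question in particular is easy to see in a minimal sketch (my own illustration, assuming the JVM target; none of this comes from the benchmark suite): a `List<Int>` goes through generics and boxes every element, while an `IntArray` compiles down to a primitive `int[]`.

```kotlin
fun main() {
    // Generic collections box: each element becomes a java.lang.Integer
    // on the JVM, with the allocation and indirection that implies.
    val boxed: List<Int> = listOf(1, 2, 3)

    // IntArray maps to a primitive int[] with no boxing at all, which is
    // why it matters for numerically intensive code.
    val primitive = IntArray(1_000) { it }

    println(boxed.sum())      // 6
    println(primitive.sum())  // 499500, i.e. 0 + 1 + ... + 999
}
```

The same trade-off shows up with nullable types: an `Int?` must be boxed so it can hold `null`, so hot numeric loops want non-nullable primitives and the specialized array types.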
Like many things in life, I’m getting to the point of being committed to this Kotlin direction for projects that solely impact me. However, if I’m going to recommend it for others or for larger-scale things, I need to have my ducks in a row and to have really been through the wringer with it before I can say whether it is the real deal or whether I’m just in the honeymoon phase with a new tool.
As I wrote about here yesterday, I am taking my exploration of Kotlin to the next level by looking at performance metrics using the Computer Language Benchmarks Game. As of right now I’ve completed my first two steps: getting the benchmark to build and run Kotlin code, and doing a straight port of the suite (follow along at the GitLab project). This was really 90% IntelliJ auto-converting, followed by me fixing compilation errors (and submitting a nasty conversion bug that came up from one test back to JetBrains). So now on to the results! Well, actually, not so fast on that one…
Like many people, I got into a rut with my software development stack, complaining about the things I hated about it and why I wanted to change. The Java stack had been treating me ever increasingly well, especially with the refactoring of Spring into what it is today, but the language itself and its stagnation bothered me and had me starting to gaze at .NET now that it’s open source. I am now officially done with that exploration and will be sticking with the JVM-based system for the time being. This is driven by three major things: Kotlin, JavaFX, and the fully open source nature of the pieces I use.
I am totally loving Ars Technica’s two-part series on the history of the IBM PC (Part 1, Part 2). However, there are some glaring omissions around the MS-DOS part of the story that I think they should have addressed in at least an afterword. My write-up here is based substantially on other articles, but most importantly this article from the Computer History Museum.
There are certain things in life that you take for granted but didn’t know you did until you didn’t have them anymore. Swagger is definitely one of them.
As the whole “what happens to Unity” thing unfolds, I decided to redouble my efforts in trying different distros again. I’m trying everything from trailing edge (latest Debian) to bleeding edge (Solus). As luck would have it, it was time for me to refresh one of my development VMs, so I decided to jump that one from Mint to Solus to give it a real-world spin. My first impression is that it is a really interesting distro and one I’ll keep playing with, but there is one not-so-tiny problem that hopefully they will grow out of.
As much as I’ve never been a fan of Unity, I’ve learned not to hate it on my host OS (and even in some of my VMs). Sure, my go-to desktops of late are mostly MATE distros or Cinnamon, but Unity hasn’t been completely unacceptable. With Ubuntu’s recent announcement of the demise of Unity, and with people openly pontificating on whether this means Ubuntu is abandoning the desktop or looking to sell to someone like Microsoft who would then kill it on the desktop, I started to analyze what this meant for me as a Linux desktop user. Is this the end of the road for that journey, and therefore back to Mac or, god forbid, Windows?