MacBook or XPS Linux Ultrabook…looks like a Mac after all

I have had two laptops for most of the past twenty years: a personal laptop and a work laptop. Before I owned my own company that was a matter of the policies of the companies I worked for. While I had my own company it was about living by the same rules that applied to everyone else in the company (I'm a big fan of dogfooding anything I do). Now that I'm on the individual consulting/developer bandwagon I'm in the same boat. I have a pretty decent System76 Linux laptop that's a couple of years old but pretty bulky. I also have a positively ancient 2011 MacBook Air. Disk space and speed wise it is fine. Memory wise, at 4 GB it's starting to get a little cramped if I have too many Google Drive tabs open and the like. Processor wise, though, it is a dog. It's at the point now where some sites like Facebook and Gmail can take tens of seconds to finish rendering. At least they allow interaction while they finish parsing their JavaScript and so on.

When the MacBook Pros came out it looked like a great opportunity to switch, and while the feature set finally looked reasonable, the price point for what I wanted was artificially high due to selections that I couldn't get with other configurations. It's not that a particular end configuration was expensive compared to a Dell (or System76); it's that I could get exactly the configuration I wanted in the Linux laptop but not the Apple, which made the actual price point lower by over $1000. With the new MacBook Air models that just came out I decided to do my standard configuration comparison. For this one I'm pricing a 13″ to replace my MacBook Air, but with reasonable bumps to the specs to make it last a long time. After all, I like to keep hardware for a while, so there's no reason to skimp and have to replace it yet again in a couple of years.

What were the end results? When I put together the exact configuration I'm looking for in both systems, the Mac comes out to $1599 compared to $1659 for the Dell. That's pretty astounding to me. I finally have a Mac option that fits my needs at essentially the same price point. What does that mean? First, it means that the demise of my antique personal laptop is imminent. Second, it means that it's probably getting replaced with another Mac.

Diaspora API Dev Progress Report 14

Yesterday was the first day in several that I could commit real time to D* again. After getting back up to speed and making the status post I went back into API development. I was able to make some good progress on some brand new endpoints. The first one I worked on, which was the first to need from-scratch coding of the main code, was the Tag Followings controller. The day before I had struggled to get Rails to make the POST for creating tags work against the spec. After talking it over and thinking about it, however, it was the spec that needed changing. In another software framework I could have just made it work, but relying on the auto-wiring in Rails brought the design flaw to light. With that simple change in place, real development of the Tag Followings endpoint started yesterday.
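As a rough illustration of what "relying on the auto-wiring in Rails" means here, a conventional resource declaration routes the tag-following operations without any hand-written wiring. The namespace, resource name, and action list below are my own guesses for illustration, not the actual API spec:

```ruby
# config/routes.rb -- hypothetical sketch, not the actual Diaspora routes.
Rails.application.routes.draw do
  namespace :api do
    namespace :v1 do
      # With conventional resource routing, POST /api/v1/tag_followings
      # maps straight to Api::V1::TagFollowingsController#create.
      resources :tag_followings, only: %i[index show create destroy]
    end
  end
end
```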

The methodology I'm using when developing the new controllers is as follows. First, I want to get the basic infrastructure and the tests in place. That means the first phase is to write the skeleton of the controller code, the skeleton of the RSpec tests, and to wire the two together. I make sure the routes behave the way I think they should according to the API spec without worrying about return values and the like. The skeleton of the controller should implement all of the routes, and the skeleton of the unit tests should cover the happy path and reasonable error conditions: the user passing the wrong ID for a post they are trying to comment on, an empty new tag to follow, and so on. I then go over to the external test application and code up the corresponding calls there as well. With everything running I make sure the endpoint is reachable from the outside (which it should be), but I don't worry about return values or processing yet. If it's easy to set up fake returns I do that; otherwise I just ensure the proper methods are called.

After all of that is coded and committed, it's off to filling in the controller method by method. For each method I complete the unit tests and the external test harness interactions as well. Once that's all done I move on to the next one. In some cases, like Tag Followings, there needs to be refactoring elsewhere, which has implications for the above flow. I usually do those pieces before coding the controller. It is at design time that it becomes apparent whether code shared with another controller should live in a Service component that doesn't exist yet. If I need to make changes in other code, I check that there are unit tests which properly cover the changes I'm going to make (at least as best as I can tell), write any that are missing, and then make the changes. This should minimize the possibility of disruption.
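To make the skeleton phase a little more concrete, here is a minimal sketch of the kind of thing I mean. The file paths, class names, and spec wording are hypothetical, not the actual Diaspora code:

```ruby
# app/controllers/api/v1/tag_followings_controller.rb (hypothetical path)
# Skeleton only: every route from the spec is present and wired up,
# but nothing is implemented yet.
module Api
  module V1
    class TagFollowingsController < ApplicationController
      def index
        head :not_implemented
      end

      def create
        head :not_implemented
      end

      def destroy
        head :not_implemented
      end
    end
  end
end
```

```ruby
# spec/controllers/api/v1/tag_followings_controller_spec.rb (hypothetical path)
# Matching skeleton spec: happy path plus reasonable error conditions,
# left pending until the controller methods are filled in.
require "rails_helper"

RSpec.describe Api::V1::TagFollowingsController, type: :controller do
  describe "#create" do
    it "follows a new tag for the current user"
    it "returns an error for an empty tag name"
  end

  describe "#destroy" do
    it "returns an error when the tag is not followed"
  end
end
```

The pending examples run as reminders in the test suite, so the "fill in the controller method by method" phase is just a matter of replacing each stub and each pending example one at a time.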

When interacting with Frank R. on the merge requests, one piece of feedback I got was that with everything compressed down to one commit it was hard to tell why I did certain things. As I code, all of that history is there, but I've been rebasing everything down to one commit per endpoint so that when it comes time to merge the API branch into the main develop branch the log will look something like: Post API endpoint complete, Comments API endpoint complete, and so on. To get around this I'm trying a new flow. When I think something is ready to be merged I'm opening a Work in Progress (WIP) Pull Request (PR). That PR has the raw commit history and "WIP" at the start of its title. After a review and a thumbs up I'll rebase it down to one commit and then submit the final one for integration. By the time a WIP PR goes up, though, the code is feature complete and should be ready to be merged, so I'm counting WIP PRs as the threshold for saying something is feature complete.

With all that said, the three new endpoints that were feature complete as of yesterday are: Tag Followings, Aspects, and Reshares.

Diaspora API Dev Progress Report 13

After a week of distractions I finally have a new update on the progress. We've successfully merged all the work done to date into the one main API branch and are now working on new features moving forward. The first feature we have completed with full tests and test harness interaction is the ability to manage and work with the user's followed tags. So we have the full post lifecycle from before, and now tags are done as well, though not yet merged into the main branch.

Diaspora API Dev Progress Report 12

The merging of the various side branches into the main branch is coming along. Because this isn't being done as a primary job, there is a bit of an expected delay between the pull request (PR) being generated and the branch being merged in. This is giving me the opportunity to work on other Diaspora features, though. The process is going along much faster than I expected it to, which is good. At this point we have merged the Likes, Comments, and Post endpoints together. The PR for the Post endpoint is still queued up, but all of those changes exist in one branch. What that means is that I was able to perform a full post life cycle test using the test harness (a rough sketch of that harness flow follows the list below). We now have an external application talking through the API and doing the following for a user:

  1. Creating a post
  2. Querying for the post and printing out its data
  3. Adding a comment to the post
  4. Liking the post
  5. Printing out the comments and who liked the post
  6. Deleting their comment on a post
  7. Unliking a post
  8. Deleting a post
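For reference, here is a rough sketch of what that harness flow looks like in Ruby. The endpoint paths, JSON field names (like guid), and the bearer-token header are my assumptions for illustration only; they are not the actual Diaspora API spec or the real test application:

```ruby
# lifecycle_sketch.rb -- illustrative only: paths, field names, and
# authentication are assumed, not taken from the actual API spec.
require "net/http"
require "json"
require "uri"

BASE    = "https://pod.example.com/api/v1"
HEADERS = { "Authorization" => "Bearer #{ENV['API_TOKEN']}",
            "Content-Type"  => "application/json" }.freeze

# Small helper around Net::HTTP so each lifecycle step is one line.
def call(verb, path, payload = nil)
  uri = URI("#{BASE}#{path}")
  Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
    request = Net::HTTP.const_get(verb).new(uri, HEADERS)
    request.body = payload.to_json if payload
    body = http.request(request).body
    body && !body.empty? ? JSON.parse(body) : nil
  end
end

# 1. Create a post and 2. query it back, printing its data
post = call(:Post, "/posts", { body: "Hello from the API test harness" })
puts call(:Get, "/posts/#{post['guid']}")

# 3. Add a comment and 4. like the post
comment = call(:Post, "/posts/#{post['guid']}/comments", { body: "First!" })
call(:Post, "/posts/#{post['guid']}/likes")

# 5. Print the comments and who liked the post
puts call(:Get, "/posts/#{post['guid']}/comments")
puts call(:Get, "/posts/#{post['guid']}/likes")

# 6. Delete the comment, 7. unlike, and 8. delete the post
call(:Delete, "/posts/#{post['guid']}/comments/#{comment['guid']}")
call(:Delete, "/posts/#{post['guid']}/likes")
call(:Delete, "/posts/#{post['guid']}")
```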

This is a very important step. Follow additional progress on the API Progress Google Sheet.