Wednesday, April 9, 2014

XP Support Ended Yesterday: What Changes?

For many people, nothing changes the day Microsoft says it won't patch XP anymore.  However, if some major virus spreads like wildfire through the remaining XP systems on the internet, you can bet Microsoft will release a "free" out-of-band patch for that vulnerability. It would be impossible for them not to.

For some people, and for some industries, XP really is dead.  My employer works in the healthcare market.  I believe that operating a Windows XP machine in a healthcare setting, whether in a hospital or a private clinic, would be tantamount to malpractice.  In the USA and in Canada, there are laws about the steps you need to take to protect the privacy of patient information.  You cannot patch XP yourself, and you have no promise from Microsoft to ever patch any future vulnerability in XP.  So using XP in a professional setting, especially in healthcare, would be irresponsible at the least, and perhaps a violation of the law in your country, if your laws protect patient privacy and hold accountable those who fail to protect patients.  I am not saying anything bad WILL happen, but that something bad COULD happen, and it would be irresponsible for you to pretend that it couldn't.

XP really was a great operating system.  It was the first version of Windows that I really grew to like. The funny thing is that I can't stand working in it anymore.  I find it incredibly quirky and broken. I never liked Vista, and I loved Windows 7.  I know some people will think I'm crazy, but I actually LIKE Windows 8.1 now. There are still some stupid things about it. The start "screen" and "Metro" are stupid, and I hate them, but aside from the glaring stupidity of the Metro crap, Windows 8.1 is a pretty decent operating system.  I use it all day, every day at work, and I like it.  I like client Hyper-V.  I like the responsiveness and overall speed of the system. When 8.1 Update 1 brings back a reasonable facsimile of the Windows 7-era start menu (non-full-screen), I think many people will at last be willing to switch up from Windows 7.

As a Delphi developer, I'm not sad to see XP go away.  It was a fantastic piece of technology in 2002, but like the machines it ran on in 2002, it belongs in a museum, not as the primary operating system on your home or work computer.

XP introduced the desktop and home user to a fully Unicode-compliant API, though true global Unicode functionality required installing a separate language pack for Far East language support.  XP introduced a fairly robust protected-mode kernel, but ran your video drivers in ring zero, meaning that video card bugs would blue-screen your system.  XP was actually really crappy by comparison to Windows 7, and in my opinion Windows 8.1 is just a tuned-up kernel revision of Windows 7, with a rather shabby new start screen and a lame-duck "app store".  But below that lame layer beats the heart of a pretty good operating system.  I find building apps in a modern Delphi version (XE5) on Windows 8.1 pretty nice.  I like making my apps look like they belong on a modern computer in 2014.  Here are a few things I think need to die along with Windows XP:

- 16x16 toolbar icons.  This was a sensible choice when your laptop screen was 640x480. No more.

- Flat gray everywhere.  Give the clBtnFace look a rest, okay?

- Those stock bitmaps that shipped with Delphi 3.  Invest in some real art and icons, okay?

I suggest you celebrate the death of XP by moving your apps beyond that ancient classic Windows look and feel, and into the 21st century.  XP is dead, long live Windows.
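
If you want a quick, concrete first step, here is a minimal sketch of turning on a VCL style at application startup.  VCL styles have shipped with Delphi since XE2; the project and form names below are hypothetical, "Carbon" is just one of the bundled styles, and the style has to be checked under Project Options > Application > Appearance so it gets compiled into the executable.

    program ModernLookDemo;

    uses
      Vcl.Forms,
      Vcl.Themes,   // TStyleManager lives here (Delphi XE2 and later)
      MainFormUnit in 'MainFormUnit.pas' {MainForm};

    {$R *.res}

    begin
      Application.Initialize;
      Application.MainFormOnTaskbar := True;
      // Swap the flat clBtnFace look for a styled one. TrySetStyle returns
      // False (and leaves the default look) if the style isn't linked in.
      TStyleManager.TrySetStyle('Carbon');
      Application.CreateForm(TMainForm, MainForm);
      Application.Run;
    end.

One line in the project file, plus a checkbox in the project options, and the battleship-gray clBtnFace era is over for that app.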

How are you all feeling about XP? Nostalgic? Still in love? Glad to see it go?  Planning to go down with the ship?



Wednesday, February 19, 2014

Interrupting Programmers (comic)

What happens to the "velocity" of a team that gets interrupted?  It's not hard to figure out.
Minimizing interruptions maximizes the work done by a non-blocked programmer. Of course, a blocked programmer sometimes doesn't unblock herself or himself, so there are no silver-bullet solutions. Leaving people to flail for months is no good, but neither is bugging them every five minutes.

But if you are the sort of person who emails someone and then walks over to their desk to tell them that you emailed them, then you need to read this even more than most people.

I found this over here; it's by Jason Heeris, and it's licensed under a CC-style license.



Friday, February 14, 2014

Happy Nineteenth Birthday, Delphi!

Next year on this day, Delphi will be 20 years old (Feb 14, 2015).  Time to start thinking about what to do for the big "Delphi is 20" birthday bash.

This year, Delphi is of legal drinking age in my home area (Ontario, Canada).  So tonight I will have a beer in honor of Delphi's 19th birthday. Just one. I'm a pretty moderate kind of guy.

Happy Birthday Delphi.  May the next 19 years be as highly productive, and as much fun, as the first 19.

Apparently Jacob over at Two-Desk Software is having a special in honor of Delphi's birthday. If you're interested in Castalia, go check out the special.
 





Saturday, January 18, 2014

The Best Programming Books on my Bookshelf. None are language-specific.


I was one of the many people crying out for more Delphi books, and I am glad that we have recently seen new books from Nick Hodges and others.  Today I'd like to cover the top five non-language-specific general books on programming that I think every developer should read.


1.  The Pragmatic Programmer - Andrew Hunt, David Thomas

This collection of snippets of advice is how I came to call myself a "Pragmatic" programmer.  I believe there are good principles in the "Agile" movement, but I hesitate to call myself "Agile".  I believe the practices in Scrum and XP can be done, and done well.  When I'm part of a team that takes those principles to heart and follows them, I'm happy to follow along too, when it works. But I don't take those monikers for myself, because I see "pragmatic" as the over-arching principle I follow: I do what works, and when it doesn't work, I change it up.  This book is chock full of fantastic advice. Some of the "get out there on the Internet and discover new stuff" advice sounds dated, as the last edition I have was updated around 2000, but the principles are timeless.  Incidentally, my "change it up when it doesn't work" dictum is probably original to XP, and was borrowed by Scrum, and then by Agile, so we're all cribbing from XP.  Nevertheless, of all the books I have read, no book better expresses, in my opinion, how developers who do great work, and do it with reasonable efficiency, actually work.*



2. Clean Code - Robert C. Martin

Nobody, and I mean nobody, has written a book that cuts closer to the heart of modern professional best practices than Uncle Bob. Nobody.  Uncle Bob first hit my radar when he passionately and carefully expounded the principles of object-oriented programming with the SOLID mnemonic. This book goes on to explain what it means to be a real software professional.  Yes, you have to unit test. But there's more.


3. Working Effectively with Legacy Code - Michael Feathers

How do I unit test that giant ball of mud that I can't rewrite?  This book answers that, and many other burning questions. No other book I have read gave me more hope that you can dig out of the holes of technical debt that were left for you to solve. Fine: now you understand the benefits of cohesion and the single responsibility principle, the dangers of coupling, and the necessity of proving that your code works by writing code that tests the assertions you wish to believe about it.  But how do you work with the code you have right now, which does not, and may never, meet your ideals about Clean Code? This book can help.
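
To make that concrete, here is a minimal sketch of the kind of "characterization test" Feathers describes, written with DUnit, the test framework that has long shipped with Delphi.  The LegacyInvoice unit, the TLegacyInvoiceCalculator class, and the expected value are hypothetical stand-ins for whatever legacy code you are trying to pin down.

    unit LegacyCharacterizationTests;

    interface

    uses
      TestFramework,   // DUnit
      LegacyInvoice;   // hypothetical unit containing the legacy code under test

    type
      TLegacyInvoiceTests = class(TTestCase)
      published
        procedure TotalWithTax_MatchesCurrentBehaviour;
      end;

    implementation

    procedure TLegacyInvoiceTests.TotalWithTax_MatchesCurrentBehaviour;
    var
      Calc: TLegacyInvoiceCalculator;
    begin
      Calc := TLegacyInvoiceCalculator.Create;
      try
        // Record what the code does today, warts and all, so that any later
        // refactoring that changes this result fails the test run.
        CheckEquals(113.00, Calc.TotalWithTax(100.00), 0.001);
      finally
        Calc.Free;
      end;
    end;

    initialization
      RegisterTest(TLegacyInvoiceTests.Suite);

    end.

A handful of tests like that, pinning down current behaviour, buys you the safety net you need before you start breaking dependencies and refactoring.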

4. The Mythical Man-Month (and other essays) by Fred Brooks

No better book has ever been written about the work programmers do, the way we run software projects, and the sad blindness to historical lessons that seems to recur in our field.  It contains a series of essays, each as good as the last.  Fred Brooks is my mentor, my hero, and my model of a software engineer, a craftsperson, a pragmatic programmer, a technical communicator, and an innovator in computing.  A must-read.


5. The Secrets of Consulting by Gerald M. Weinberg

This is a book about giving and receiving advice.  It is not pointed out nearly enough how much of a senior developer's, architect's, or team leader's role consists of giving and receiving advice. Whether you are advising your own boss or someone who is only paying you temporarily (as in the title, Consulting), you will be better off if you read this book and understand the lessons inside it.  This book tells you how to give your advice in a way that will actually be listened to.  If you want to be an effective technical communicator, you must know how to deal with people, not just with machines. This book is a must-read for that reason.

What are your favorite programming books that don't cover any language, library, or specific tool, but rather cover the practice of being a programmer?

______

* I should point out that I haven't read Extreme Programming Explained by Kent Beck, but I hear a lot of people say the same thing about the XP book. I did read the Agile Software Development with Scrum book, and was underwhelmed.  I do, however, agree with all the practices it mentions, but I feel that Agile often becomes, as Fred Brooks would put it, a "silver bullet": something we hope will save us.



Wednesday, January 1, 2014

Zombie Apocalypse Survival Toolkit Part 37: The Self Contained Software Environment

The following is an example of my weird sense of humor.  If you don't get it, or you aren't interested in this Gedankenexperiment, my regular Delphi-themed blog content will resume tomorrow.

Imagine you knew three weeks, or three months, ahead of time that there was going to be a major Zombie Apocalypse, or something else that would basically take the internet offline for the rest of your (possibly short) life.  Imagine, furthermore, that after some months you are able to get your basic human needs taken care of, such as a place to live, clean air, clean water, food, plumbing, and electricity, and get back to life at, say, a 1950s level.  You still have your computer, but it's disconnected from the internet, and it's unlikely that anything like the internet will re-emerge in less than, say, 20 years.  You wish to be a computer guy, but you now find that the world in which you might do some computing requires you to be a bit of a rugged individualist.

Windows is a non-starter in my self-contained post-disaster scenario.  For example, one fine day, due to some glitch in Windows activation, all your Microsoft Windows operating systems cease to boot.  You don't have the source code to Windows, and you can't fix it yourself.  You are now out of the computing business.  Fine, you say, the world's over, I don't care.  But wait, it's not over...


I sometimes think about this idea of having a "self-contained programming environment".  It basically has the following attributes:

1.  I have the source code to the whole thing, from the metal upwards.

2.  I can boot it to a command prompt (live CD or USB stick) or a full graphical desktop, without any installation, on any PC-compatible machine.  I can also install the operating system onto a new system after such a boot-up, either using an automated installer or by hand, if necessary.

3.  I can repair anything that goes wrong with it, because it's constructed according to rational principles, and because I'm a smart person.


So let's think about how you would build this.  I can think of a number of ways.  I thought about Linux, but most Linux distributions are not designed in a way that makes them easy to use in my post-apocalyptic world.  Debian and Ubuntu have apt-build for rebuilding from source, including an apt-build world capability.  Some source-based Linux distributions out there are pretty powerful, but I find them hard to use; there's stuff like Arch Linux and Gentoo.

Personally, I think FreeBSD and DragonFly BSD have a lot to recommend them for my "zombie apocalypse" deployment scenario.  They are far simpler to deploy, modify, understand, and thus repair than any Linux system, and the "ports" tree for them is much smaller, a naturally curated set of "stuff you would want on a desert island".  Remember that when things break, you can no longer google the error messages.  A simpler, functional system with fewer lines of code will be easier to maintain than a larger one.  The BSD kernel, for example, is orders of magnitude simpler to understand than the Linux kernel.  The BSD user-space layout is much simpler than a Linux distribution's layout, closer to classical Unix practice, and has a certain Zen to it that you have to experience to understand.

If you had a large amount of disk space to reserve for a Debian network mirror, say one terabyte, and you could figure out how to keep it all working and free of bit-rot, you could probably keep a Debian system going on your local area network.  By comparison, a complete mirror of FreeBSD or DragonFly BSD, because it contains a lot less software, would be much smaller and more portable.  It contains a much higher percentage of the stuff you might wish you had (compilers and such) and a lot less of the end-user stuff.  So if you wanted to, say, have a reasonable set of open-source games, I would go with Linux, but if you wanted a more general-purpose hacking system, in the classic and noble Unix meaning of the term hacker (not a cracker, but a wizard of software), I would go with the BSD variants.

That Zen-of-BSD thing makes it fun to plan with. For example, FreeBSD and DragonFly BSD have this cool thing called the "ports tree", which is basically the old Unix hacker way of getting software onto a Unix system.  Originally they used CVS (that ancient version control tool), but FreeBSD has since switched to Subversion, and DragonFly BSD to Git.  Because of Git's distributed nature, I think it's ideal for the kind of self-replicating deployment we would need while rebuilding after the zombie apocalypse.

Anyways, if the recent ice storm has you thinking about being a survivalist, and your weird mind also wanders towards "software survivalism", check out DragonFly BSD.  One portable USB drive of about 1 TB, and you've got a pretty amazing "self-contained software survival kit" for PC-compatible hardware.  Batteries included.  And source code too.

Now if you'll excuse me, I'm off to plan a bit more about the water, the food, and the electricity parts of my survival plan.  Those might be a bit harder to figure out than the software side of my plans.
Also, I think I need a USB Geiger Counter, in case of Radioactive Zombie Apocalypse.

Tuesday, December 24, 2013

Merry Christmas to all.

Merry Christmas, everyone. Here are some fun things for you to play with over the Christmas break.  Remember to put your laptop down and just hang out with those family people.

Some Toys to play with:
First, a list of interesting free stuff, a present from Scott Hanselman.

Some Delphi Specific Fun Things:

Apologies if you've seen these before, but hopefully these will be new to some of you.



  • The Itinerant Developer blogged about a FireMonkey container for VCL forms back in July. I missed it then; if you did too, check it out. Pretty cool.
  • Nick Hodges's awesome new book mentions Delphi-Mocks by Vincent Parrett.  I'm playing with it right now to increase my unit-testing powers. If you never read Vincent's intro post about it, it's still a good read, even though it's been a year. I get the idea that lots of people haven't seen the need for this yet, which means they aren't testing enough. (A tiny sketch of what it looks like follows this list.)
  • If you still haven't downloaded it, you owe it to yourself to go get Andreas Hausladen's latest IDE Fix Pack for your Delphi/RAD Studio.  Thanks, Andreas, for building such a great add-on. I use it and swear by it for Delphi XE2, XE4, and XE5, which are the versions I have active coding duties in right now.
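
For anyone who hasn't tried Delphi-Mocks yet, the shape of it is roughly the following.  This is a sketch from memory of the pattern in Vincent's intro post, not a verified snippet, and the IWeatherService interface is made up purely for the example.

    program DelphiMocksSketch;

    {$APPTYPE CONSOLE}

    uses
      System.SysUtils,
      Delphi.Mocks;

    type
      // A hypothetical interface standing in for something in your own code.
      // Delphi-Mocks works with interfaces that have a GUID and method RTTI,
      // which deriving from IInvokable provides.
      IWeatherService = interface(IInvokable)
        ['{D5C1A9A2-3C44-4E0B-9C1D-2B7A1F4E8A10}']
        function GetTemperature(const City: string): Integer;
      end;

    var
      Mock: TMock<IWeatherService>;
    begin
      Mock := TMock<IWeatherService>.Create;
      // Stub the call so the test never touches a real service.
      Mock.Setup.WillReturn(21).When.GetTemperature('Toronto');

      // Pass Mock.Instance anywhere an IWeatherService is expected.
      Writeln(Format('Toronto: %d', [Mock.Instance.GetTemperature('Toronto')]));
    end.

In a real test you would hand Mock.Instance to the class under test and assert on its output, rather than calling the mock directly.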

Update:  Wow, great link at The Old New Thing to James Mickens's essays.  It's worth reading everything that Raymond Chen has linked to there.  Every serious Delphi (or Win32 C/C++) developer should read Raymond Chen's blog, The Old New Thing.  But you all are already reading it, right? I especially love the first link, about systems programming, called "The Night Watch".









Monday, December 23, 2013

What do you lose by moving to distributed version control from Subversion? When is using Subversion the right choice?

This is a follow-up post with a few counterpoints to the last post about distributed version control.  There are a few things that I can see Subversion does better than either Git or Mercurial. In the interest of fairness, I will point them out here:

1.  It appears that Subversion, given adequate server hardware and local area network bandwidth, may be a better solution for teams who need to do checkouts larger than 2 gigabytes, especially if you simply must check in binaries and version them. Your mileage may vary; in your situation, you may find that the point of inflection is 750 megabytes.  A common example is video-game development, where assets easily hit 10-20 gigabytes and are commonly versioned with Subversion.

2. Subversion supports partial and sparse checkouts, something you don't get with distributed version control systems, and the attempts to add sparse checkouts to DVCSes have been so flawed that I would not use them.  The nearest relevant and useful DVCS equivalent is submodules.  Most users who need to do partial checkouts in Subversion will find that they want to investigate using submodules in a DVCS.  If submodules do not meet your needs, then maybe a CVCS is best for your team. If you need different users to have different subsets of the repo, use scatter/gather workflows, or otherwise do sparse checkouts (svn co http://server/trunk/path/to/sub/module3 rather than being forced, in Git or Mercurial, to do a clone that is roughly equivalent to svn co http://server/trunk/ ), you may find Subversion meets your needs better.  It is a common rookie mistake to conflate DVCS repo scope with CVCS repo scope.  DVCS repos are intentionally simpler and smaller, rather than following the "this server is your whole code universe" monster-mega-repo strategy that Subversion pushes you towards.

3.  Subversion has global commit numbering; that is, your ONE and only Subversion server has a commit counter, and since this global asset is shared by everybody, "commit #3" on one computer can never be anything other than "commit #3" on anyone else's. Git and Mercurial instead generate globally unique hashes to identify commits, and the local revision numbers, where available, should generally just be ignored because they differ for each user.  For some workflows you might find the global commit numbering suits you better than referring to abbreviated hexadecimal hash strings that identify your commits and have no inherent ordering.

If I've missed anything, I'll add it in an update. Anyways, those are the three golden reasons I know of why some teams will evaluate DVCS and then stick right where they are with Subversion, which, by the way, does seem to me to be the best open-source CVCS out there right now.  I only wish there were a better GUI than TortoiseSVN, and that they'd do a little work to make command-line merging less stupid.

Update: Someone was confused about why you would want users to "generate" hash keys, which means I didn't explain it properly. The version control system generates the hashes, and "hashing" means feeding your changeset through a cryptographic hash function. The possibility of a hash collision is so low that you will never encounter one.  Git and Mercurial both use them, and I have never heard of a hash collision, ever.  My only reason for mentioning it is that in a distributed system there is no single incrementing counter available to give you unique, incrementing numbers. Not a big deal, but something to know before you leap.  More info here.

Update 2:  Today I spent some time fighting Subversion working-copy folder corruption.  Issues like this one, which were a big problem in 2008 and 2010, are still a big problem in 2014.  That's bad news for Subversion users.

Update 3: The big thing you'll lose when you leave subversion behind is all the bugs and the missing features. Subversion, and TortoiseSVN are giant piles of bad design, festering technical debt, and the parts that work bug free still have glaring functional deficiencies.  I don't think I'd miss Subversion one bit if I could move the projects that use it to something else, I would.