Science Demystified

One often hears the remark “Well, Science has proven that…”. This has always sounded very strange to me. Is Science a living thing? How can it prove something if it’s not living? Science without a living being is as dead as any other piece of matter. That is why this statement is strange: it gives people the idea that Science is some kind of living thing, a kind of inexorable dictator that tells it as it is. Well, that is not the case.

Well, let’s get the definition of the word Science from Dictionary.com:

Science

noun

  • a branch of knowledge or study dealing with a body of facts or truths systematically arranged and showing the operation of general laws: the mathematical sciences.
  • systematic knowledge of the physical or material world gained through observation and experimentation.
  • any of the branches of natural or physical science.
  • systematized knowledge in general.
  • knowledge, as of facts or principles; knowledge gained by systematic study.
  • a particular branch of knowledge.
  • skill, especially reflecting a precise application of facts or principles; proficiency.

So it’s very clear that Science cannot prove anything; only a Scientist can, or indeed anyone can, using logic and observation.

All you need to prove anything is to physically demonstrate it. For example, one can prove that oil floats on water by pouring it into a glass of water and observing the result. You don’t have to “prove” anything or do any convincing; you just have to let someone observe what you are doing, and he or she can see for themselves! There is no convincing involved, just getting someone to look for themselves and see.

As soon as someone has to refer to something that one must just believe or trust, instead of insisting that you look and observe for yourself, you are opening the door to tyranny and to being duped.

Sometimes demonstrating something takes many steps, so many that we decide to trust someone else to do the “observation” and tell us what the result is. This can be abused, and is abused today, to manipulate people into believing or trusting blatant falsehoods! A typical example of this is antidepressants that mostly increase the risk of suicide, yet are marketed as antidepressants. There must be a huge mis-observation here!

Looking around, one can see many examples of others’ false observations being put out and accepted on the basis of authority.

Don’t buy the “It’s too complicated” lie. There is almost nothing that can’t be communicated and observed if one just learns all the terms used and takes the time and effort to look and observe.

The job of the Scientist is to observe and discover new things and ideas AND communicate the results so that we too can observe what he saw. This cuts out all the questions about the “Ethics” of Science. The ethics consist only of being honest about what was observed and letting others observe as well.

One trick is to create a “theory”, and if it is made obscure enough people can just start believing it is true! There needs to be honesty here: it is always a theory. A theory needs to be tested, and once it bears out in practice it is no longer a theory, it’s an observable fact.

Too many times a theory is proposed, some tests match it, and the theory is then declared fact, when all that has been proven is that some instances of observation satisfy that theory.

So Science is an organised body of knowledge: a body of observations, and of proposed patterns that describe those observations and help predict them. Science doesn’t prove anything; it is just the organisation of what is observed.

Artificial Intelligence

I’ve been wanting to write something on this topic for quite a while now. Artificial Intelligence, or AI as it is known, was initially made popular by Alan Turing last century.

It then seemed to die down in popularity, and recently it has made a comeback.

I am interested in writing about it because I believe it is filled with a number of myths and misunderstandings. To fully explain what I am talking about, I am going to have to delve into some very basic human questions. I am going to have to go where modern-day science has kept away from in droves, for whatever reason. Does this sound controversial? Well, it is! You see, when we talk about artificial intelligence we need to get our terms defined first. (Oh, and here’s another one: defining terms. Isn’t that obvious? Define one’s terms, or get them defined? No, it’s not! Quick test: what is the definition of Computer? Do you know the definition right away? No? You need to look it up!)

So looking up the definition in the Merriam-Webster dictionary we get the following:

Artificial Intelligence – Noun

  1. : a branch of computer science dealing with the simulation of intelligent behaviour in computers
  2. : the capability of a machine to imitate intelligent human behaviour.

And just to cover all the bases, here is the definition of Intelligence:

Intelligence – Noun

  1. a (1) : the ability to learn or understand or to deal with new and trying situations : Reason; also the skilled use of reason. (2) : the ability to apply knowledge to manipulate one’s environment or to think abstractly as measured by objective criteria (such as tests).

So artificial intelligence is tied up with computers, or machines that can store, retrieve and process data. Artificial intelligence, then, will be storing, retrieving and processing data in its most complex and powerful sense, but in the end still only storing, retrieving and processing data. The main advantage of computers is the ability to do these three basic operations fast and on huge amounts of data.

So all a computer can do is store, retrieve and process data. The limits of artificial intelligence, then, are the limits of what can be accomplished by storing, retrieving and processing data. This is an important point, because my question is: can intelligence be encapsulated in the storing, retrieving and processing of data?

Intelligence by its very nature goes beyond merely storing, retrieving and processing data. Also, every approach the computer may use has to be programmed by a human, and each step the human adds is a rote step: comparing input data to an internal database and processing it into some response. This action is not intelligence, and will never be intelligence. It may seem like it at times, but it is by its very nature just a set of programmed responses captured by the programmer.

This does not mean it can’t help analyse large amounts of data. But it does mean that a human is needed to interpret the output and spot insights not programmed into the computer.

I mentioned controversy above; well, here is where it enters in: intelligence is life, and a computer, no matter how it is programmed, is not alive. A computer can never get a new idea, but life can! A computer can only give out what it has previously been programmed to give out. Life originates totally new things, new ideas, and is able to introspect. A computer has to be programmed to look like it’s introspecting, but then it’s not introspecting, it’s just performing the programmed response it has been set up with.

Don’t believe the media hype that computers can understand, et cetera. They are just providing programmed responses. Life originates; it is there! It senses. Computers can’t sense, only react!

Have you ever had someone give you a fake smile? What made it fake? It was the intention behind it. Now you, as a living being, can sense that, because it came from a living being.

The point I’m making is the difference between a machine and a living thing. One can get caught up in the hype and forget the difference!

So can artificial intelligence replace a human? Well, only a very machine-like human, the type you wouldn’t want to employ anyway, and who wouldn’t really want the job you were offering!

People use machines and computers; computers don’t use computers! That’s the difference! And the only reason for computers is that there are living beings who either need them or want them for the task at hand!

It also comes down to what we are doing in life. Are we living to increase the life and enjoyment of all or are we trying to cut other people out of the game? Artificial Intelligence should make it possible for more people to participate in even greater production, not take away someone’s ability to participate.

So no real jobs should be lost to artificial intelligence, despite the hype we hear. Actually, artificial intelligence is just the computer doing what computers do: store, retrieve and process data at high speed and in high volumes.

Are your programmers working for you?

Well, I’m cruising at 36,000 feet above the Atlantic, cramped in economy class with better things to do. I came across an article in CIO about developers and what they get up to.

The gist of the story is that developers usually have their own agenda, which is at variance with their employer’s interests. I looked back on my experiences in this line and saw many occurrences of this.

Many years back I had a realization: most developers (programmers in other words) don’t really know what their jobs are!

This may sound a bit alarmist or snide, but I have found it to be an actual fact. After handling this stumbling block in my business, I started to get more reliable and usable systems developed. Our company actually started challenging companies ten times larger, and winning! I had overcome the so-called ‘technology debt’, the main stumbling block in development: developing effective, re-usable and maintainable code.

Software development companies don’t get it, but one of their biggest stops is what I call developer inertia and employee individuation.

These two things are the bane of software development.

By developer inertia I am referring to things like: ‘It needs to be rewritten in Java’, ‘I don’t do C++’, ‘It’s very complicated to do’, ‘I can’t tell how long it will take’, ‘It’s not object oriented’ and so on. You see, in my mind any programmer who comes up with one of these is not a programmer but a ‘hobbyist’! I hope that doesn’t offend anyone, but if you look at it, that’s what it shows. A professional gets the job done; he or she creates a solution that fulfils the business need, because he actually gets what is needed and sees the bigger picture.

By employee individuation I am referring to the guy (or gal) who separates himself from his team and his employer. He is the one who tries out the latest fad in some development because he will learn something new, not because it helps his employer by delivering an effective solution. He is also the one who says it can’t be done, but then shoots down an external contractor who is willing to actually do it! Do any of these sound familiar? I’ve had my share, both as the employer and as the external consultant trying to get a good solution in.

Well, behind these two is just the fact that the employed programmer or consulting programmer does not know what his job really is! It’s that simple. The moment you sign your employment contract you are agreeing not to be a hobbyist, but someone who helps the company by creating effective and maintainable, reusable code. It’s no longer just for your own fun. The fun expands to the fun of making your employer and company win! That is the difference. That’s the end of individuation. That’s being part of the team, part of the group, and so on.

This is not necessarily the programmer’s fault. It’s the employer’s responsibility to make the programmer part of the team and inform him clearly of what is wanted. Here the employer needs some knowledge as well, so he is not hoodwinked. The programmer needs to be a real team player with the team’s goals in mind; then we all win.

Owning your knowledge

Everything we do in life concerns knowledge. Some understanding of the subject is required to accomplish anything with it.

If we broaden this outlook we see that, in essence, life is about knowledge! When you know something well you can excel at it. You are also less likely to become the effect of it.

Now in this world of patents and rights the question arises: who owns any piece of knowledge?

Interesting question. However, patents and intellectual ‘property’ actually control the usage of certain knowledge for commercial ends, NOT ownership of that knowledge!

Yes! You can own, and should own, any knowledge you garner. It’s just that for business reasons et cetera we have some limitations on using certain pieces of knowledge for commercial purposes. That is by agreement, and just by living in a country and being a citizen you are implicitly agreeing to some things; we usually call these things laws.

However you can actually own any piece of knowledge! What does owning knowledge mean? It means that you can think with it as your own, work out further ideas with it as your own, and develop your own new ideas from it, as your own. You may even develop your own intellectual property this way.

The important concept here is that you own all the knowledge you can lay your hands on. You can think with it and build your own ideas. Just keep in mind that if something is patented you can’t use it commercially, but you can still get good new ideas from it.

Owning your knowledge means taking full responsibility for it and means you can really use it and create actual effects from it in the world outside.

Owning your knowledge makes you effective with it, and is the only real way to succeed with it.

Learn lots of things and never stop learning, but make sure you own your knowledge!

The ‘innovation’ myth

We hear a lot about innovation in the software industry these days. It is sold as the ultimate thing to aim for in any software endeavor.

I was looking at this the other day and it seemed like there was something missing there.

Let’s look at some really successful technology companies.

How about Microsoft? Well, for one they are successful. Like it or not, they are. But are they innovative? I’m sure they have introduced some innovations, but none spring to mind as I look! That’s interesting! How about Google? Now they are definitely very successful, and what innovation did they bring to the table? The search engine? No! I think Yahoo was first!

Ok, let’s look at the biggest tech company by market capitalization: yes, good old Apple. Now they are just full of innovation! Are they?! Well, HTC came out with a touch phone before them, I think! I actually had it; not a bad device. And they didn’t invent the Personal Computer.

Hmm. So where is all this innovation? Or, to phrase it better, where are all the companies that created all these innovations?

I think this idea of ‘innovation’ is just a red herring. It’s what computer magazines like to talk about to get interest!

All the above successful companies improved their products, found out what was needed and wanted, and then produced it. That’s all they did! People needed an operating system and office suite for their PCs, and Microsoft provided that, and they still do! They did not innovate! They just provided what was needed and wanted and put it out there!

Despite all the press around these things, that’s all that Apple, Microsoft, Google, Oracle and the others that have had some success have been doing.

Apple listened to their users and kept perfecting the iPhone to the point where it now has a dedicated following. Google did the same with their search engine, and later with Android! No real innovation per se in either camp.

So the concept of innovation hides what was really done to achieve success. People can spend endless time trying to innovate, instead of just providing what is needed and wanted and doing a good job at exactly that!

So the moral of the story is: don’t worry about innovation. Just focus on being creative in providing what is needed and wanted, do a good creative job of that, and I guarantee you will succeed!

Find out for yourself

One lesson I have learned through the years, and have needed to re-learn at some points, is a lesson that I feel could change the face of innovation in the software industry.

Unfortunately, schooling these days teaches a pupil to depend on authority, to depend on other people’s evaluations and ideas. These ‘other’ people are supposed to be ‘authorities’. They are supposed to be the ones to follow, et cetera. They look, observe and evaluate, and then tell us what we need to do! So what they sell is second-hand observation and, even riskier, second-hand evaluation of data.

However, this creates a closed framework of limited innovation, because one now ceases to innovate and instead relies on the observation and evaluation of others, the ‘authorities’, et cetera.

We notice that all true innovation comes from those who didn’t follow the authorities, those who dropped out of university or college, et cetera. Good examples are Steve Jobs, who dropped out of college; Bill Gates, who did the same; and Larry Ellison, co-founder of Oracle Corporation, who didn’t complete his university studies.

It’s interesting that modern education does not emphasise research; it only emphasises learning what the ‘authorities’ report. And one notices that all true innovation comes from those who look and find out for themselves.

We need a lot more people to do this. The current system is open to exploitation by those who would like to limit competition, by those who have vested interests, by those who are even scared of competition.

I recall once wanting to make my own spatial display engine. I was excited about making a spatial engine geared towards large data sets and working with databases, as I felt that was what would be needed to provide spatial solutions in the future. It’s amazing how much discouragement I got! People told me that there are big corporations that do this for a living, namely ESRI et cetera, and that they have programming teams of hundreds, so how could I compete with that? I was told that companies like Bentley Systems, with their MicroStation product, have an excellent spatial engine.

I was surprised, because neither of these ‘big’ companies had a spatial engine geared towards databases and huge datasets. I know these companies were pretty happy and making good revenue, but why on earth should the fact that they are big and successful be a reason for me not to create my own spatial engine, with some features better than theirs?

Luckily I did not listen to these ‘helpful’ warnings. I loved the excitement of creation and went ahead, on my own time and money, to build my spatial engine using the latest technology. And guess what: despite all the ‘competition’, my spatial engine pays all my bills, has allowed me to make hundreds of thousands of dollars in humanitarian donations, and has given me oodles of fun in the process!

I have even beaten companies 10 to 20 times my size in tenders for spatial software using my spatial engine!

So I learnt my lesson: find out for myself! Don’t listen to others’ evaluations. Rather, on hearing them, look for yourself. Because who knows, maybe you will create the next Apple, the next Microsoft, who knows!

I just recalled an example from early in my career. It was the ’80s. Windows was very young, Unix was the flavour of the day, and DOS or OS/2 was what you used on PCs. The Mac was too expensive! I wanted to develop system administration software that could run on Unix, DOS and OS/2.

I had an idea: write my own language and a virtual machine for it, so the same code could run on all these platforms; all I needed was to write an implementation of the virtual machine on each platform. I wanted it to support a text-windows type of interface.

One colleague of mine, an MSc in computer science, warned me that compiler writing was very tricky and very advanced, and that it was difficult to verify that your compiler was correct, et cetera! Well, I was too keen for some fun, so I went ahead and designed my own language! It even had local functions, types and records, much like Pascal. I wrote a compiler for it, and an assembly language for my virtual machine. I did do one new thing, though: I designed the virtual machine so that execution state could be saved and restored, even on a different machine. I wanted to use this for workflow, where a workflow could suspend itself, transport itself to another computer, and resume execution there!
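
To illustrate just the save-and-restore idea, here is a toy sketch in C. The opcodes, the state layout and the file handling are all invented for this illustration; the original design was of course far richer:

    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_PRINT, OP_YIELD, OP_HALT };

    typedef struct {
        int pc;           /* program counter  */
        int sp;           /* stack pointer    */
        int stack[64];    /* evaluation stack */
    } VmState;

    /* Run until the program yields or halts; returns 1 if resumable. */
    static int run(const int *code, VmState *s)
    {
        for (;;) {
            switch (code[s->pc++]) {
            case OP_PUSH:  s->stack[s->sp++] = code[s->pc++];               break;
            case OP_ADD:   s->sp--; s->stack[s->sp - 1] += s->stack[s->sp]; break;
            case OP_PRINT: printf("%d\n", s->stack[--s->sp]);               break;
            case OP_YIELD: return 1;   /* suspend: the state can now travel */
            case OP_HALT:  return 0;
            }
        }
    }

    int main(void)
    {
        /* push 2, push 3, yield (suspend), add, print, halt */
        int code[] = { OP_PUSH, 2, OP_PUSH, 3, OP_YIELD, OP_ADD, OP_PRINT, OP_HALT };
        VmState s = {0};

        if (run(code, &s)) {
            /* Save the suspended state: it is one flat struct, so moving a
               running computation is just moving these bytes (same-architecture
               assumption, purely for illustration). */
            FILE *f = fopen("vm.state", "wb");
            if (f) { fwrite(&s, sizeof s, 1, f); fclose(f); }

            VmState resumed = {0};
            f = fopen("vm.state", "rb");
            if (f) { fread(&resumed, sizeof resumed, 1, f); fclose(f); }

            run(code, &resumed);   /* resumes after the yield; prints 5 */
        }
        return 0;
    }

The whole trick is that everything the program needs to resume lives in that one structure, so “transporting” a computation is just shipping those bytes to wherever it should continue.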

Well I completed this little project and got it up and running, and used it for system admin scripts on DOS, Unix (SCO Unix) and OS/2!

It may sound familiar to you Java guys, but I did this before Java came on the scene!

The point is that I, as a young programmer, could design and implement my own language and virtual machine! Had I listened to the expert, I would have missed out on all the fun, and not provided a neat solution to my employer at the time!

So before you just take someone else’s evaluation on a subject, look for yourself! Empower yourself; you are a lot brighter and more able than you think! I know: I have worked with many programmers, probably over 100 of them, and they only acted stupidly when they didn’t look for themselves and instead acted on third-party data!

The Reason Why

Have you ever wondered what it is that slows software projects? What is it that makes some projects miss their deadlines and others succeed?

What is the key item that throws a spanner in the works on some development endeavours? Surely there is something that, if we handle it, can prevent these key failures in the software industry.

Well I know there is! And this is the subject of my post.

Now, I am not going to give you a theory. Too many do that kind of thing and think it’s the be-all and end-all. If I give an answer to this question, it needs to be testable: if applied, does it speed up development or not? If it does, it’s correct in that case; if not, it’s false.

And if it’s correct in the majority of cases, then we have a good rule we can live by and get results from.

Ok, so I have stuck my neck out, but let’s see how this can be.

Imagine you are working on a software program. It is meant to read a text file and count the words in it. A simple application. So you look up the Application Program Interface (API for short) for opening and reading text from files. Seems simple enough. Ok, so the program is coded and now we run it as a test. What happens is you get a message printed on the console: “An error occurred”.

Ok, so it’s not working. Well, what now… Ok, let’s use the most famous debug method (printf). Before I call the API to open the file I print a message, and after it is opened I print another. After running again I see the first message, and then the famous “An error occurred” message.

Good, so it must be that my API call to open the file has an error. But what am I doing wrong? Ok, so I go back to the documentation and try to figure out what could be causing this error. I see that if the file does not exist it gives a “File not found” error. Ok, then let’s rename the file it is trying to open and run the program again. It now says “File not found”. Good, I see a link between the existence of the file and the program. Ok, let’s rename it back and run again just to see. “An error occurred”!!! Ok, what can this be?

Anyway, I get desperate and decide to delete the file to see what happens then. Guess what? When I try to delete the file I get a message: “Insufficient privilege”! I log back in as an administrator, try to run the program again, and… it works!

What is the lesson we can learn from the above? I must admit that the example is a bit fabricated, but I guarantee you that the scenario painted above can be reproduced almost exactly in most languages! I know, I’ve been there and suffered that!

Now imagine if, after the first run above, the error message had been “Unable to open file ‘Test.txt’ for reading: the calling user does not have read access”. I would have fixed the attributes of the file, or my privileges, and got the program working. It would have saved me 20 minutes.

So the incomplete error reporting caused me to lose 20 minutes of valuable production time.
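
To make this concrete, here is a sketch in C of what the better message costs: essentially one line, using errno and strerror. The open_input name is my own invention for illustration, and ‘Test.txt’ is just the file from the scenario above:

    #include <errno.h>
    #include <stdio.h>
    #include <string.h>

    /* Open a file for reading, reporting exactly what failed and why.
       strerror(errno) turns the OS error code into text such as
       "Permission denied" or "No such file or directory". */
    static FILE *open_input(const char *path)
    {
        FILE *fp = fopen(path, "r");
        if (fp == NULL) {
            /* Bad:  fprintf(stderr, "An error occurred\n");          */
            /* Good: name the operation, the file and the real cause: */
            fprintf(stderr, "Unable to open file '%s' for reading: %s\n",
                    path, strerror(errno));
        }
        return fp;
    }

    int main(void)
    {
        FILE *fp = open_input("Test.txt");
        if (fp == NULL)
            return 1;
        /* ... count the words here ... */
        fclose(fp);
        return 0;
    }

Run against the scenario above, strerror would have said “Permission denied” on the very first attempt, and the whole 20 minutes evaporates.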

Here is another scenario: my boss asks me to write a mobile app, using iOS and the Swift programming language. I have only programmed in C and C#, and am very keen to try something new. Who hasn’t had this happen?

Anyway, I download my “Beginner’s Guide to Swift” and “Programming Using Cocoa Touch” and get reading and practising. It takes me two weeks, and I can shakily put together a recipe-creation app. I find Swift has many idiosyncrasies I am unfamiliar with, so I make quite a few errors.

Well after a few weeks I get something together.

If I were an iOS Swift programmer it might have taken me two days; here I caused a loss of 10 working days. Sure, the next project will be quicker, but in this iteration I lost 10 days, and that is what I’m interested in here.

What is the common denominator of the above two scenarios?

Insufficient data! And maybe incorrect data!

Now I want you to scan through all the software problems and failures you have had and see what you can find. I bet you it was either insufficient data, incorrect data, or both!

In my whole career in software, I have seen this to be the sole cause of all slows and failures.

This is a sweeping statement, but it is true!

It can manifest itself in many ways. One subtle way is a programmer not having a correct definition or understanding of something. If that is the case, I guarantee you that that programmer will be slow, and will fail, in anything that has to do with the misunderstood concept or thing.

Unfortunately, bad error reporting is the main culprit in this area. In my business I threaten to fire programmers who fail to give detailed and explicit error messages. If the message should not be visible to end users, then it damn well better be written to some log file.

All the big software providers fail in this area, but I am afraid the biggest culprit is Oracle! Their most famous error is “Illegal character”, given if you put a closing semicolon on some statements but not on others! They don’t even tell you which character is illegal! I have spent literally weeks getting software to work on the Oracle database when in Microsoft SQL Server it took only a few days! All the slows were caused by this insufficient, and sometimes incorrect, data.

Detailed and precise error reporting quickly removes all mysteries, saves time and gives confidence.

On the other side of the coin, get yourself as skilled as possible, clear all the terms involved to full understanding, and get all the data you can on the subject you are programming in. That way you will be fast and create great software products.

Apprenticeships – the missing ingredient

I started my first computer job in 1985, July to be exact, just after completing what was called “National Service”.

Luckily for me, I got a good amount of computer experience doing my National Service. I learned RTL2 and RSX-11M, the PDP operating system, and got to do some numerical optimisation algorithms that were tested in real life!

Anyway, when I arrived for my new job as Software Designer (mmm, better than Programmer), my boss Greg threw a copy of “The C Programming Language” by Kernighan and Ritchie onto my desk and said: “Well, around here we are going to use this language.”

Eager to please, I began devouring the book. What was nice was that it had lots of practical examples. I worked my way through, and in just under a week I showed my boss my word-counting program. He said I must write a spell checker, with a dynamically updatable dictionary. I started to get used to the idea of parsing text streams, and this was getting exciting.
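
For the curious, that first exercise is essentially the word-count program the book itself builds up to. A minimal reconstruction (from memory, not my 1985 original) looks like this:

    #include <stdio.h>

    #define IN  1   /* inside a word  */
    #define OUT 0   /* outside a word */

    /* Count lines, words and characters from standard input. */
    int main(void)
    {
        int c, nl = 0, nw = 0, nc = 0, state = OUT;

        while ((c = getchar()) != EOF) {
            ++nc;
            if (c == '\n')
                ++nl;
            if (c == ' ' || c == '\n' || c == '\t')
                state = OUT;
            else if (state == OUT) {
                state = IN;
                ++nw;   /* first character of a new word */
            }
        }
        printf("%d %d %d\n", nl, nw, nc);
        return 0;
    }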

Now, unknown to me, it seems that somehow I had learnt this language in under two weeks and the other old hands had not got to grips with it yet! Just shows you: if you have enough necessity you can do almost anything. Well, I had no clue, but I really enjoyed all the things I could do with this language.

Next we needed a document pre-processor. It had to include and exclude sections of text based on command-line arguments. It had “if” sections, and allowed inclusion or exclusion of the text in the “if” or “else” sections.

The challenge was nesting these, and I made my first use of recursion to handle it simply. I even had boolean expressions (evaluating to true or false) in the “if” sections. This started a monster, which I will cover later.
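
To give a flavour of the recursion trick, here is a minimal sketch in C. The directive names and the single-flag conditions are stand-ins I have invented for illustration; the real tool evaluated full boolean expressions:

    #include <stdio.h>
    #include <string.h>

    static int    nflags;   /* flags given on the command line */
    static char **flags;

    /* A condition is "true" if its name was passed as an argument. */
    static int flag_set(const char *name)
    {
        for (int i = 0; i < nflags; i++)
            if (strcmp(flags[i], name) == 0)
                return 1;
        return 0;
    }

    /* Copy lines while 'emit' holds, recursing on each nested #if so any
       depth of nesting just works. Returns on the matching #endif (or at
       end of file). 'parent' says whether the enclosing section emits. */
    static void process(FILE *in, FILE *out, int parent, int cond)
    {
        char line[512];
        int emit = parent && cond;

        while (fgets(line, sizeof line, in) != NULL) {
            if (strncmp(line, "#if ", 4) == 0) {
                char name[256] = "";
                sscanf(line + 4, "%255s", name);
                process(in, out, emit, flag_set(name));  /* the recursion */
            } else if (strncmp(line, "#else", 5) == 0) {
                emit = parent && !cond;   /* switch to the other branch */
            } else if (strncmp(line, "#endif", 6) == 0) {
                return;
            } else if (emit) {
                fputs(line, out);
            }
        }
    }

    int main(int argc, char **argv)
    {
        nflags = argc - 1;
        flags  = argv + 1;
        process(stdin, stdout, 1, 1);   /* the top level always emits */
        return 0;
    }

Each nested “if” simply becomes another recursive call, so the nesting depth takes care of itself; that was the whole insight.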

Anyway, the point I am making here is that I was apprenticed by Greg and Chris, both experts in this programming field. They gave me tasks, corrected me, and kept me learning. This gave me confidence, and caused a large transfer of competence from these two bosses, with their years of experience, to me.

I am very grateful to these mentors. After working with them I had the confidence: I knew how to code, I could do anything in code, and all I needed was the business requirements!

The sad thing is I don’t see that happening these days. I have seen so many Honours and Masters graduates starting at companies and not knowing HOW to get things done in programming. They have a fair amount of data about programming, about what is incomputable and what is provably correct, et cetera, but no knowledge of how to DO programming and create systems. In fact, what I noticed most was that the longer they had studied, the more they knew was impossible or very difficult to do in computing!

The drop in the expertise of the average programmer is very evident to me looking back over these times, and I think the key element causing it is this abandonment of apprenticeship.

Real time programming

In my first actual job, in 1985, we did real time programming on Digital Equipment Corporation VAX computers.

I liked these machines, the VAX 780s; they had a 32-bit address space, so no fiddling with selectors or active page registers.

However, compared to today they were highly primitive. My Apple Watch has more memory than those VAX machines had, and who knows, maybe more processing power!

Anyway we programmed in RTL2.

My job was to design faster versions of the built-in operating system functions for synchronising processes and for inter-process communication.

We had separate VAX machines connected via shared memory, that is, memory accessible to both computers.

I once wrote code to synchronise the clocks using the shared memory. It was amazing: when I ran the program it caused a system crash, because the two computers were also connected on a LAN and the LAN also synchronised the system times. The system couldn’t handle the times synchronising so quickly!

I also had to write a program to close all programs on the system for shutdown. I ran it the first time and it came back with an access-not-allowed error, so I asked the system admin to grant me the access to test my program. When I ran it again, people started saying that their terminal sessions were suddenly ending. I realised that I had failed to check that the processes were in the correct group! Then, mercifully, the next bug showed up: the process terminated itself by mistake, saving the other users from losing their sessions. I did my testing after hours from then on!

I learnt a lot about process synchronisation in those days. We didn’t have threads then, only processes!

It’s interesting how things have changed. In those days real time programming was a separate discipline; nowadays, with the improved hardware speed, it’s hardly ever mentioned anymore. I think all we have to do these days, if our programming is real time, is ensure we don’t have garbage collection threads suddenly consuming CPU cycles when we need performance!

Actually, the UI of any decent smartphone is built on real time programming fundamentals! (In my humble opinion!)

I enjoyed the Real Time World in those days; we were considered “System Programmers”, not just ordinary programmers.

Anyway, the contract ended and new, more interesting things were becoming available: GUIs and graphics, and Windows had just become 32-bit!…