Decision-making circuits


 

Photo by Dan Coulter

Think that machines can’t decide for themselves?

Hard drives that can sense when they are being dropped. Computers that recognise when you are asleep. These are already on the shelves or in development. Electronic equipment is already starting to think – insignificantly compared to the human mind, admittedly – but it's happening, and it's going to continue to spread quickly and invisibly into our lives.

I believe we can help speed things up, because all of these electronic functions rely on one thing to be effective: reacting to us. Or specifically: to our actions, based upon decisions made by our minds, informed by our five senses. And this is the key: our senses. If we begin to track and record the responses of our senses, we can build up a database of information 'ready made' for the technology of the future to plug in to 'ourselves' and the way we need it to behave for us.

Let's have an example:
Note the time of day we go to sleep and wake up. Do this over enough time and the computer will be able to build an approximate pattern for the time of year, each season, Monday to Friday, and how our social life (and therefore sleep pattern) changes at the weekend. It would be nice to have some software now to enable me to do that conveniently. There currently isn't any.
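Since that software doesn't exist, here is a rough sketch in Python of what such a logger might look like. The file name, format and field names are all my own invention – one possible shape for it, not a real product:

```python
# A minimal sleep/wake logger: append timestamped events to a CSV file,
# then group bedtimes by weekday to expose the weekly pattern.
# The log format here is assumed, not taken from any existing software.
import csv
from collections import defaultdict
from datetime import datetime

LOG_FILE = "sleep_log.csv"  # hypothetical log: ISO timestamp, "sleep" or "wake"

def record(event):
    """Append a 'sleep' or 'wake' event stamped with the current time."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), event])

def average_bedtime_by_weekday():
    """Average the logged bedtimes per weekday, as HH:MM strings."""
    minutes = defaultdict(list)  # weekday name -> minutes past midnight
    with open(LOG_FILE, newline="") as f:
        for stamp, event in csv.reader(f):
            if event == "sleep":
                t = datetime.fromisoformat(stamp)
                minutes[t.strftime("%A")].append(t.hour * 60 + t.minute)
    averages = {}
    for day, mins in minutes.items():
        avg = sum(mins) // len(mins)  # naive mean; a real version would wrap
        averages[day] = f"{avg // 60:02d}:{avg % 60:02d}"  # times past midnight
    return averages
```

Run `record("sleep")` at bedtime and `record("wake")` in the morning, and after a few weeks the weekday averages start to show the pattern.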

Another?
Note our travel. Not just the times, locations and distances, but also the reasons why we are travelling and what caused it to happen. E.g.: a friend asked, and we said yes (based upon the time of day, the distance and the location, possibly).

The more data we capture, the more detailed a 'personal travel map' we can create and use, perhaps, to inform our vehicle in advance when it may be required to start the engine. E.g.: if it's a weekend, and I receive a phone call from my dad, and the call is very short, and there is nothing planned in my diary, and I am within 15 minutes of his location, it is very probable (based on my life pattern) that I will drive over to see him. However, if my in-house electrical activity continues 'as normal' (as per the other data recorded), my vehicle can make the reasonable assumption that it will not be required, and perhaps use its already-on state to charge the car battery, top up the water levels, run a diagnostic check on its own electronics, or just go back to 'sleep'.
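That reasoning could be written down as a handful of rules. A toy Python version follows – every signal and threshold in it is assumed for illustration, where a real system would learn them from the recorded life-pattern data:

```python
# A toy version of the vehicle's reasoning in the example above.
# All inputs and thresholds are invented; a real system would derive
# them from the owner's recorded habits rather than hard-code them.
def vehicle_action(is_weekend, caller, call_seconds, diary_empty,
                   minutes_from_caller, house_activity_normal):
    """Decide whether the car should warm up, self-maintain, or sleep."""
    likely_trip = (is_weekend and caller == "dad" and call_seconds < 60
                   and diary_empty and minutes_from_caller <= 15)
    if likely_trip and not house_activity_normal:
        return "prepare engine"   # the house has gone quiet: probably leaving
    if likely_trip:
        return "stand by"         # the call fits the pattern, but wait and see
    # no trip expected: make use of the idle time instead
    return "charge battery, run diagnostics, or sleep"
```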

Now from this example alone you can begin to see how much data we need to collate and make available in a format that computers, and in turn electronic circuitry, can make use of, and also how many different types of equipment – electronic today or not – will need to talk to each other. E.g.: my door lock is currently a manual key-in-the-door type, but in future I'm assuming it will be a motion-sensor and/or fingerprint type – neatly connected up to everything in my home.

My now-electronic door lock can talk to my television (and its programme activity) and my fridge, and make a reasonable decision about when I am most likely to go out and grab a pint of milk for the next day. Based on the time of day and the connections to my previous activity, it can decide, when I walk up to it, whether I am likely to be activating the deadlock so I can go to sleep at night, or about to unlock it so I can leave the building.
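Sketched in the same toy style, the lock's guess might look like this – the hour cut-offs and household signals are assumptions, not real hardware APIs:

```python
# Hedged sketch of the door lock's guess: lock-for-the-night versus
# about-to-leave. The cut-off hours and input signals are invented.
def lock_intent(hour, tv_just_switched_off, fridge_low_on_milk):
    """Guess why the owner is approaching the (future, electronic) lock."""
    if hour >= 22 and tv_just_switched_off:
        return "engage deadlock"  # late, TV off: bedtime routine
    if fridge_low_on_milk and 7 <= hour <= 21:
        return "unlock"           # daytime and no milk: likely a shop run
    return "ask"                  # ambiguous: fall back to manual control
```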

Expand this sensory profile data to lots of people, have everything reading the shared information and feeding our technology, and I could plan an evening – or perhaps my home could. It would be too easy to suggest an invitation for an evening with my partner as an example of what this talking technology could do. Maybe flick it around and have my technology 'see' that I'm doing nothing on a Friday evening, and that the fridge has enough items in it to make a nice curry. Nick loves curry, is a friend and is also available that evening; however, the only decent film on TV that night is a horror, and Nick hates horror, so my sensory profile selects Matt and Ben, who love horror, love curry, and are available. All the selection factors are in place to create an open invitation that can be presented to me when I get home from work, to approve or reject.
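The friend selection boils down to a simple filter. Here it is in Python, with the friends list and their tastes as made-up fixtures standing in for the shared sensory profiles:

```python
# One way the "open invitation" selection could run. The friends,
# their tastes and availability are fixtures invented for the example.
friends = [
    {"name": "Nick", "likes": {"curry"},           "available": True},
    {"name": "Matt", "likes": {"curry", "horror"}, "available": True},
    {"name": "Ben",  "likes": {"curry", "horror"}, "available": True},
]

def pick_guests(menu, tv_genre):
    """Keep friends who are free and like both the food and the film."""
    wanted = {menu, tv_genre}
    return [f["name"] for f in friends
            if f["available"] and wanted <= f["likes"]]

print(pick_guests("curry", "horror"))  # -> ['Matt', 'Ben']; Nick drops out
```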

If that's not 'thinking technology', then I don't know what is.

The more we consider our decisions, the easier it is to see how they affect different equipment in our homes. Now don't assume that I'm unaware of the complexity of our minds' decision-making processes, as long as you are aware of the mind-numbingly dull and repetitive actions of the 'creatures of habit' that we all are. I'm not talking about computers having emotions and being creative – I'll save that explanation for my second post – I'm referring to 'tasks' that have frequently identifiable factors in the decision-making process, which computers can recognise.

Anyways, if you’re still with me, there’s more…

If we assume that the data from our five senses will become useful to our technology in future, then what are we hanging about for? We have computers now, we have a method of recording our senses, so why not begin?

I'd like to; however, none of the software that I've been talking about exists yet, so how do I know that what I capture now will be of any use then? The only language that computers have understood since they were invented, and that I will (maybe unreasonably) assume will always exist in the future, is binary. If I capture my data in binary form now, it is very possible I will be able to find a way of making use of it in the future. Hoo-raa!

Now ones and zeros aren't very exciting, but remember those multiple-choice exam papers that were thrown at us at school? Colour in the little box and bingo – our answer is read and can be scored by the computer. Why not use that approach? Create a personal sensory profile, based on set questions, for each of our senses.
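That multiple-choice idea maps straight onto bits: each yes/no answer becomes a one or a zero, and the whole profile packs into a plain number that any future system able to read binary could decode. A small Python sketch – the questions themselves are placeholders:

```python
# Pack multiple-choice (yes/no) answers into an integer, one bit each,
# so the profile survives in the plainest binary form possible.
# The questions are placeholders invented for this example.
QUESTIONS = [
    "Asleep before midnight?",
    "Travelled more than 15 minutes today?",
    "Used lights after dark?",
    "Left the house this evening?",
]

def encode(answers):
    """Pack a list of booleans into an int; first question = lowest bit."""
    bits = 0
    for i, yes in enumerate(answers):
        if yes:
            bits |= 1 << i
    return bits

def decode(bits):
    """Unpack the integer back into question/answer pairs."""
    return {q: bool(bits >> i & 1) for i, q in enumerate(QUESTIONS)}

profile = encode([True, False, True, False])
print(bin(profile))     # 0b101 - the raw ones and zeros
print(decode(profile))  # the same answers, recovered
```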

Make the project excitingly 'Tomorrow's World / sci-fi', enthuse many people, and a percentage will run with it. Keep the momentum going and present the data as it builds, and eventually someone will create a raw program to read and display that captured data. In time, and with enough hands to work on it, electronics companies will begin to 'read' and 'feed' their technology our data, based on the requirements for our sensory activity to peak (1) and trough (0).

Sounds like a massive open-source collaborative project to me!

Isn't this enough to get started? I think so. Wouldn't you like to have technology that can predict your mundane activities based upon your senses? Even if it's just to 'not' illuminate the room based on the time of day you decide to walk around your home? Because maybe your partner is sleeping and you don't want to wake them with lights, but your sensory profile knows the curtains are open and it's a full moon projecting enough light into your bedroom to let you see easily enough – et voilà – no need for the motion detectors to flick on the lights!
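Even that little moonlight rule can be written down. A last sketch, with the lux threshold and sensor inputs assumed rather than taken from any real hardware:

```python
# The no-lights-at-full-moon rule from the paragraph above.
# The threshold and input names are assumptions, not real sensor APIs.
def should_light(motion_detected, curtains_open, ambient_lux, partner_asleep):
    """Trigger the lights only when there is movement AND real darkness."""
    moonlit = curtains_open and ambient_lux >= 0.1  # roughly full-moon level
    if not motion_detected:
        return False
    if partner_asleep and moonlit:
        return False  # moonlight suffices: stay dark, let them sleep
    return True
```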

You see, the more you think about how connected technologies can talk to each other, the more you can see the benefits of allowing them access to your data. Just because something hasn't been built yet doesn't mean it won't be, or that we can't prepare for, and possibly steer, its inevitable production.

So, this is one idea. I don't want it though. You can have it. I've got enough to think about without needing to get involved in the practicalities of making this a reality; however, if you want to discuss anything further, you know where I am – my computer can probably tell you.

mark
