‘THINKING’ MACHINES: Well, How Do You Think About Them?

Carnegie Mellon University is reportedly set to announce the launch of a centre to study the ethics of Artificial Intelligence. That is today’s topic on CerePlast, and you’ll surely find it interesting. It is the kind of news that can leave a techie awestruck: Pittsburgh’s K&L Gates law firm is investing $10 million in the project.


Source: BBC

Unlike most such centres, this one won’t concentrate on business, management or higher studies. Its main objective will be to ask simple yet practical questions, for example, how killer drones should be used in a war zone.

The centre primarily focuses on humans: their behaviour, how they perceive situations, and what they think of their own plans and actions. All of this is done for one reason, to understand Artificial Intelligence better and more deeply.

For Artificial Intelligence to be applied more widely and responsibly, we need deep knowledge of how a human would react to a situation, and that same knowledge then needs to be programmed into a machine. In the field of medicine this has already worked in a few areas, because there is a definitive, fixed procedure to follow. Far less judgement is required there than in the Army, Navy or Air Force; essentially, in any area that involves the use of weapons.

How will a machine signal to a human that it is no longer able to control itself properly? How will a human program the machine not to analyse particular data sets?

Questions like these need to be answered before Artificial Intelligence can be taken to the next level, and answering them will require a long, thoughtful process. That is likely why this centre is being set up at such colossal cost. All in all, this centre is about ‘THOUGHTS’ and ‘PERCEPTIONS’ of and about critical situations, not about how to write extensive code.