The rise of the machines. Humanity may be in danger if no ethics are enforced.


Ethical concerns are a perennial topic at the intersection of technology and humanity, and as robots become more capable, there are ethical questions that need to be addressed, says one expert.

Academic Bertolt Meyer, who is nicknamed “the bionic man,” said recently that “scientists and engineers should not be allowed to launch some technological advances on the open market without a prior ethical debate,” Britain’s Guardian newspaper reports.

The keynote speaker at “FutureFest” in east London, Meyer – who has had a cutting-edge £40,000 artificial lower arm and hand since 2009 – discussed whether the public should allow the economic laws of supply and demand to dictate how mankind moves toward a probable “bionic” future, in which those with access and the necessary finances will be able to augment and enhance their bodies.

“If they start to appeal to everyone, a mass market will develop,” he said, arguing that engineers on the cusp of research and development don’t always think through the impact of their work or the ethical questions it raises.

In a separate interview with the Observer, Meyer said he thought the business community would be “arrogant and naive” if it collectively continued to assume that commercial interests could and would solve ethical dilemmas on their own.

Sources

http://www.theguardian.com

http://www.sfgate.com

http://www.azcentral.com



Categories: Technology

