Peter Sweeney
2 min read · Feb 21, 2019


I agree with you: To say something is computable does not guarantee we have the resources to consider it a feasible solution. I reject the popular notion that more resources will necessarily lead to AGI.

I also agree that an explanation or physical law is not Nature itself. For example, as it relates to AI, I believe our explanations are most lacking in the area of creativity and conjecture. We don’t have a good philosophical understanding of the problem, let alone a formal solution.

I’m also not a proponent of simulated humans as exemplars for AI. I believe a narrower view focused on scientific discovery (anticipating a solution to the creative gap mentioned above) is a more promising, less bloated vision.

I’m familiar with Penrose’s argument as it intersects with the points above, including his comments on the Turing principle and Gödel. He rejects the Turing principle, but understands that a better explanation requires new physics and processes, yet to be discovered.

Unlike many who lazily conclude that creativity is forever and necessarily human, or something supernatural or immaterial, Penrose sees it as a problem rooted in mathematics and physics. He sees (mathematical) creativity as the root of the problem. But in his world, there is no universal machine, so some thought processes would be beyond machines.

I welcome new explanations; they move things forward. The reason we got started on this thread is that I struggle to find a good explanation in the original article. So until that time, I have the Turing principle and the claim of a computable world.


Written by Peter Sweeney

Entrepreneur and inventor | 4 startups, 80+ patents | Writes on the science and philosophy of problem solving. Peter@ExplainableStartup.com | @petersweeney

Responses (1)