
> That's a religious position. We have absolutely no idea what consciousness is or how it works.

Well, we know that it exists, or at the very least seems to exist. And there is nothing in the physical world that we have understood so far that cannot ultimately be expressed computationally, even if it's usually not the most useful formalism. It is of course somewhat a leap of faith to then say _everything_ is computation (or can be expressed as computation), but I would argue that science already made this leap in its infancy: "The Book of Nature is written in the language of mathematics." Admittedly, modern philosophy of science is a bit more nuanced, but Galileo's old statement basically stands. Now, I agree that it's not entirely rational, but it certainly isn't "religious".

So yeah, I think there are very good arguments to be made that consciousness can, at least in principle, emerge from some kind of computation. Whether that computation is a complete simulation of every atom in a brain, or some shortcut is possible, doesn't matter for my point. That does not mean, however, that I think it can be made or will be made, since, as you pointed out, we have no idea what we are talking about. I completely agree with Lanier on this: the tech industry should stop focusing on this nonsense.


> It is of course somewhat a leap of faith to then say _everything_ is computation (or can be expressed as computation)

You don't need any kind of leap of faith to actually start working on it. That's the wonderful thing about AI and AGI more broadly. You can actually go work on it, today. Are you going to solve it immediately? Of course not, but at least you can chip away at the problems.


> And there is nothing in the physical world that we have understood so far that cannot be ultimately expressed computationally

Except for those physical phenomena that can only be modelled with a recursive form whose coefficients would need to be defined with infinite precision - which unfortunately includes a lot of useful things, like weather prediction and even the classical mechanics of n-body systems.
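A standard illustration of this point is sensitivity to initial conditions. The sketch below uses the logistic map with r = 4 (a textbook chaotic system, chosen here purely as an example, not anything from the thread): two trajectories starting a trillionth apart become macroscopically different within a few dozen iterations, so no finite-precision measurement of the initial state lets you compute the long-run behaviour.

```python
# Sensitivity to initial conditions in the logistic map x' = r*x*(1-x), r = 4.
# Two starting points 1e-12 apart diverge until their separation is of order 1,
# so finite-precision initial data cannot determine the long-run trajectory.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-12
max_sep = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_sep = max(max_sep, abs(a - b))

# The tiny initial gap has been amplified by many orders of magnitude.
print(max_sep)
```

The separation roughly doubles each step (the map's Lyapunov exponent is ln 2), which is why about 40 iterations are enough to amplify 1e-12 to order 1.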

How about telling me what tomorrow's Dow close is going to be? Or the oil price four years from now? Or which parts of the universe that electron over there passed through on its way to your monitor?

I think a lot of the responses here just make the point for Lanier:

"There's a whole other problem area that has to do with neuroscience, where if we pretend we understand things before we do, we do damage to science, not just because we raise expectations and then fail to meet them repeatedly, but because we confuse generations of young scientists."

Anyone who believes that consciousness is computable by definition, because it just has to be, has a dilettante-level view of the problem. Philosophers have been arguing about this for centuries, and on the whole they're not so certain about it.

So I'll repeat - we don't have adequate models for human or even animal personality, or for emotional responsiveness, or even for the self-aware perception of qualia, which is apparently one of the key things that defines consciousness.

Big data isn't a solution to this, any more than throwing the works of Shakespeare into a database and looking for clustering statistics about word proximity and sentence structure will get you a new Julius Caesar.
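To make the "clustering statistics about word proximity" point concrete, here is a toy bigram generator, an illustrative sketch of my own, not anything Lanier proposed. It picks each next word from the words that followed the current word in a short training snippet. The output is locally word-shaped but has no plot, argument, or intent behind it, which is the gap being described.

```python
import random

# Toy "word proximity" model: each next word is drawn from the words that
# followed the current word in the training text. Locally plausible sequences,
# zero authorship. (A few lines of Julius Caesar serve purely as sample data.)
text = ("friends romans countrymen lend me your ears "
        "i come to bury caesar not to praise him "
        "the evil that men do lives after them").split()

following = {}
for cur, nxt in zip(text, text[1:]):
    following.setdefault(cur, []).append(nxt)

random.seed(0)
word = "i"
out = [word]
for _ in range(10):
    # Fall back to any word if the current one has no recorded successor.
    word = random.choice(following.get(word, text))
    out.append(word)

print(" ".join(out))
```

Every word in the output comes from the source vocabulary, and the local transitions are all attested, yet nothing about the procedure models what the sentences are about.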

The reality is there are levels of intelligent behaviour - especially self-aware creative behaviour - that are completely opaque to any modelling technique available in current CS. I believe that anyone who thinks the problem is simple and just needs moar code and a faster processor thrown at it is expressing a faith-based position of hopeful belief, not a reality-based fact.

Shakespeare is Shakespeare because the writing isn't word salad. Shakespeare is interesting because of the unusual density and richness of the experiences that are referred to - not just in the plot, but in the details of the metaphors in each sentence.

You need to model experience before you can recreate that, and you can't do that unless you have a deep understanding of what experience is. Big data gives you slightly more focussed monkeys and more typewriters, but it's in no way a solution to the basic modelling problem.


I have to agree. The total lack of technical explanation is suspicious to say the least.

Also, the 'recovered' color of Carole Lombard's picture on the home page contradicts the one on this page: http://www.solargreencolor.com/about_solar_green_color.html


Of course not. What makes you think it is?


Not sure I understand your question.

PS: What does [Flagkilled] mean? I see it at the end of my post.


> I only know a handful of people (myself included) that believe Snapchat does delete your photos. Everyone else I know believes that Snapchat has some secret database somewhere with all of your photos on it.

It takes a special kind of stupid to even consider the idea that the photos are _really_ deleted.


Well, then I guess you're calling a majority of the population (including doctors, lawyers, artists, musicians, etc.) "stupid".

What you really mean is someone who doesn't understand how technology (and tech companies) work.

This could easily be mitigated if we enforced privacy laws saying that if you tell someone something is deleted, then it must actually be securely deleted. Such laws might even be on the books already; just enforce them.
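The gap such a law would have to close is easy to show in code. This is a hypothetical sketch (the class and method names are my own, not Snapchat's or any real system's): many services implement "delete" as a soft-delete flag that hides a record from the user while keeping the data, whereas "securely deleted" would require actually removing it, including from backups, caches, and replicas in practice.

```python
# Hypothetical illustration of soft delete vs. actual deletion.
# Soft delete (what a UI delete button often does) only hides the record;
# hard delete (what "securely deleted" would require) removes it.

class PhotoStore:
    def __init__(self):
        self._rows = {}  # photo_id -> {"data": ..., "deleted": bool}

    def put(self, photo_id, data):
        self._rows[photo_id] = {"data": data, "deleted": False}

    def soft_delete(self, photo_id):
        # The record is hidden from the user but still stored.
        self._rows[photo_id]["deleted"] = True

    def hard_delete(self, photo_id):
        # The record is actually gone from this store.
        self._rows.pop(photo_id, None)

    def visible_to_user(self, photo_id):
        row = self._rows.get(photo_id)
        return row is not None and not row["deleted"]

    def still_stored(self, photo_id):
        return photo_id in self._rows

store = PhotoStore()
store.put("p1", b"...")
store.soft_delete("p1")
# After a soft delete: invisible to the user, still held by the service.
print(store.visible_to_user("p1"), store.still_stored("p1"))
```

From the user's side the two are indistinguishable, which is exactly why enforcement would have to look at what the service stores, not at what its UI shows.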


If something goes online, it stays online: proxies, long-term caches, archives, you name it. Relying on privacy laws for secure deletion would be the same mistake.

Delete buttons should really be labelled as "hide stuff from everybody except us."


First of all, my comment was meant in the context of the article, which is about the use of technology by _teenagers_, not the general population. And even so, yes, people should know better; privacy has a big place in the public debate nowadays.

