
Chabsff, last Monday at 7:13 PM

Ok, but how do you go about measuring whether a black-box is doing that or not?

We don't apply that criterion when evaluating animal intelligence. We sort of take it for granted that humans at large do it, but not via any test that would satisfy an alien.

Why should we impose white-box constraints on machine intelligence when we can't do so for any other kind?


Replies

deadbabe, last Monday at 7:18 PM

There is truly no such thing as a “black box” when it comes to software; there is only a limit to how much patience a human has for understanding the entire system in all its massive complexity. It’s not like an organic brain.
