Why you can't imitate the brain artificially

Unless these things can be implemented in some way, a device or system can't make decisions based on mattering, or deceive itself at a deeper level. Such a system ignores free will (assuming free will exists), along with intention, motivation, and the stories people live within a societal context. It also sidesteps the accountability that boundaries and laws provide in a human system. And how would you simulate, in a neural network or whatever mechanism you use to duplicate the brain, the amount of self-deception every human being carries: the cognitive distortions that support our value systems? How would you reproduce a sense of self-mattering, or reflection on a lifetime or sequence of societal encounters?

Can a computer do self-reflection?