As everyone knows, I don't do much talking in class, but a lot of the time I am thinking about a lot of things. Our last class made me question a particular point more than usual. One point we touched on concerned our concept of objects. We said, and please correct me if I'm wrong, that objects don't have freedoms or responsibilities. That made me think of three movies: A.I. Artificial Intelligence, I, Robot, and Bicentennial Man with Robin Williams. I was asking myself in class whether or not people would agree that robots do indeed have free will, especially those in the movie I, Robot.

I lean toward the idea that, to a degree, robots have free will and are able to do what they want, as long as it does not extend outside of their programming. Some people may say that robots are only programmed to do certain things, and although that may be true for our time now, if we take the movies listed above into account, it can reasonably be assumed that we are not too far off from these types of robots existing. My main point is that robots are objects, yet what would be said about their 'future' ability to govern themselves and make decisions, even decisions that may have the potential to harm their own creators?

Then there are those who still say that the robots of the future are still robots, and they still cannot do things outside of their programming; they are limited. However, if we only look at robots as things that cannot exceed their programming, are we not ourselves robots? We ourselves have limits. As Dr. J likes to point out, we cannot defy gravity, we cannot choose to be a certain height, and so on; thus, we cannot go outside of our limits either. Some people might also say, "You may be right about that, Jordan, but our freedom is God-given, whereas a robot's is not." To that I will say this.
Let's just pretend that you have a robot and he begins to act crazy. What do you do? You say to him, "If you don't act better, I'm going to shut you down or reprogram you." You ultimately threaten your robot. Does not God threaten those who do not follow his commandments with condemnation to hell? Religiously speaking, God created humans and humans have created robots; yet humans CAN BE NOTHING MORE THAN the same as robots, because both creators have given each of their creations limitations (sometimes the human creator even gives his robot creations fewer limitations than himself). Has God not given humans free will, and will humans not, in a sense, one day give robots free will? The argument will still be made that robots have programming, but that is exactly what humans live by themselves, such as the Ten Commandments and the Bible.

Also, just personally, I think that God does not give freedom, because those who follow him are always living in the thought of "Is what I'm doing okay with God?" There cannot be total freedom with strict regulations attached. When I think of this, I think of the old Ford Motor Company propaganda: "You can buy any color truck, as long as it's black." It's like God says you can do all these things for which I give you the opportunity, as long as you do them exactly as I say.