Faith is taking God at His Word. Faith looks at the promises God has given and brings them to life in our lives. That is, if we are truly taking Him at His Word, we will live a life based upon what He says.