I am a very political person, but I am very respectful of the political beliefs of others. Many, many Americans were once gung-ho about the current Administration, but as time has gone by, many feel let down and lied to, while others stand behind the belief that America is going to come out of this better than ever. I live in an extremely conservative area that is all about God and guns, so I am wondering about the rest of the country. What is your opinion? How does the rest of America feel? Voting is private.