"The world is finally starting to ‘wake up’ to what Japan has to offer."
*Japan sure does love its lasers. Oh yeah, and women. Them, too.*
Just as Japan adopted much of its political and economic structure from the United States, especially after its defeat by the U.S.-led Allies in World War II, the U.S. has gained quite a bit of culture in return. From anime to manga, Pocky to sushi, many beloved American staples come directly from Japan. So why is there still a stigma attached to loving Japanese culture?