My wife has a friend who traveled to France this past summer and went on a tour bus to Normandy. She said the tour guide was constantly talking about how bad America is and how the US and Allied soldiers who liberated France were "worse" than the Germans. If that is a common sentiment among the French now, then I'm not surprised by the above.
My father took part in the Normandy Invasion (he came ashore a few days after D-Day). As an MP, he had lots of interaction with the locals, whom he described as extremely unappreciative and generally bitching about everything American. A common theme was that the invasion tore their farms up and that the American Army should pay for the damages.
As a side note, I should add that some of the French ladies were most appreciative of the G.I.s. One of the guys in my dad's unit bore a striking resemblance to the French actor Fernandel, which ensured that there was a flock of young women around when things were calm.