I think that France was seen as the USA's best friend and strongest ally right up to the end of WWI. Then Great Britain supplanted France as the number one American ally and has remained so since (if you don't count Israel).
There were a number of reasons why Americans felt close to France in the pre-WWI era. One was that the French gave very significant help to the Americans during the American Revolution: not only did Lafayette and French troops fight on the ground, but the presence of the French fleet (which had defeated and driven off a large British naval force) made certain the American victory at Yorktown, the decisive battle of that war. So France was seen as a great ally against the British, and remained so for some time.
It was only the rise of the modern state of Germany under Bismarck's brilliant guidance that changed that situation...eventually driving the French and British into one another's arms, as it were, through their common fear of Germany's imperial ambitions.
I know that during Mark Twain's life, for example, the French were seen as the natural ally of Americans, and were much admired, while the British were regarded with some measure of hostility.
Another reason for the close relationship between Americans and the French was that France, like the USA, had rejected monarchy in the late 1700s and turned to a republican form of government. That was a very significant political issue in the 1700s and 1800s, and it was another thing that made Americans like France better than Great Britain, despite the common language Americans shared with the English.
All that changed during WWI, and the USA and Great Britain became staunch allies from that point on, while American affection for France dwindled to some extent...possibly due to French intransigence during the Treaty of Versailles negotiations, and later due to the early French surrender in WWII and various unfortunate incidents in which the Anglo-American Allies ended up fighting Vichy French forces in North Africa.
The French Army had long been seen by many as the finest army in continental Europe, possibly in the world, but that reputation eroded, beginning with the catastrophic defeat in the Franco-Prussian War and the subsequent weakening of France's position in Europe.
It was really the rise of Germany which ended the primacy of France in western Europe. The Germans were more populous than the French, and they proved to be superior in the art of war.