Still trying to get this discussion back to my question. Yes, I know the Romans called it Germania (or something like it). But the Germans don't call it Germany, they call it Deutschland. I want to know who called it Deutschland first: the Germans, or their neighbours? And ditto for England etc.