No, I don't think so. I don't think a healthcare act would suddenly cause a complete shift in America's political landscape. Having a more equal and fair healthcare system shouldn't be political at all; it should be something every country strives for.
I live in the UK, where we have the National Health Service (NHS). In this system, everybody contributes towards the upkeep and provision of the NHS through taxation. The system is extremely fair (everybody has 'free' access to healthcare and treatment at the point of use) and is also cost-effective. In fact, the UK government spends a far smaller proportion of GDP on healthcare than America or France does.
We've had this system since 1948, and it's never caused us to have a truly 'socialist' government. In fact, the political consensus has shifted to the centre-right in recent years. Using this as an example, I don't think healthcare reform in the US would lead to socialism gaining popularity in the country.