Why Aren’t Winter Tires Mandatory In North America?
It is strange that North America has no stricter regulations for winter tires, given the clear evidence that they are a major safety factor for any type of vehicle in winter conditions. Still, there is no law making them mandatory. A law could instead require approved winter tires whenever winter conditions are present. This would leave the choice to each driver: anyone who wants to drive during winter would need to switch to winter tires when the conditions call for it. A...