The same routine that helps target a missile from a drone could help target viruses more effectively, but someone has to make the choice to use it for that purpose.
Things are amoral. The use and creation of things, however, are open to moral and ethical considerations.
Making a radiation therapy machine without safety mechanisms could be considered unethical. Making a computer vision system that tracks targets is morally neutral (what's a good word for something that is neither ethical nor unethical? I'm not fond of using "amoral" for this), but applying it to a weapon targeting system hits an ethical gray area, as does any other weapon engineering task.
Like most acts, the ethical status of building something depends largely on intent and foreknowledge. Knowingly creating a system that will (not just could) be used to harm others (physically, emotionally, financially, etc.) can be considered unethical. Making a system that is most likely going to be used for harm might fall on the unethical side of the gray area, even if it has a number of potentially positive or neutral uses. Making a system that merely might be used for harm (consider knives) is probably not unethical, but then the manner of sale and marketing becomes important to the calculation.