Ethical subroutines were programming routines by which artificial lifeforms, like Lieutenant Commander Data or Lore, and holograms, like The Doctor, determined what was ethically right and wrong. (TNG: "Datalore"; VOY: "Equinox, Part II"; Star Trek: Insurrection)
Although Lore was Noonien Soong's first successful android, he had difficulty adapting to the ethical subroutines that Soong created to guide his behavior and his interactions with Humans, forcing Soong to begin work on Data instead. Lore grew increasingly bitter when he learned that there was no real difference between the two of them, and that his inability to adapt actually made him the "inferior" model.
In 2369, Lore disabled Data's ethical subroutines and made him perform dangerous experiments on the Borg and on Geordi La Forge. With no moral obligation to his friend, Data no longer cared whether he hurt La Forge. Matters were made worse because Lore had found a way to let Data experience emotions while allowing him only negative ones; this made Data bitter and vengeful toward his former friends, as he could focus only on their negative emotional impact on him without appreciating the good experiences they had shared. Using a Borg interlink transceiver taken from a damaged drone, Captain Jean-Luc Picard and Deanna Troi generated a kedion pulse that created interference in Data's positronic net and reset his ethical subroutines to normal. Although Data still experienced only negative emotions, he was once again able to choose whether or not to act on them. (TNG: "Descent", "Descent, Part II")
When Data sustained damage after discovering evidence of the Son'a's attempt to relocate the Ba'ku, the damage activated his fail-safe system, putting his ethical subroutines in direct control and leaving him in a state where the only things he knew were right and wrong. This caused him to act automatically to protect the Ba'ku from the Son'a and the Federation personnel who threatened them, unable even to distinguish between the Federation officers involved in the plan and his friends and colleagues on the Enterprise, until Picard and Worf captured him and shut him down long enough for him to be repaired. (Star Trek: Insurrection)
Aboard the USS Voyager in 2375, The Doctor developed a feedback loop between his cognitive and ethical subroutines after making a subjective decision that cost the life of a crewmember, Ahni Jetal. He had "allowed" her to die when she and Ensign Harry Kim were both injured by the same alien weapon; able to save only one of them before the weapon's effects killed them both, he elected to save Kim simply because he knew him better. As a result, he suffered a "mental breakdown" of sorts, berating himself and his abilities as a doctor and blaming himself for Jetal's death; his original programming, which directed him to make the most pragmatic decision in a difficult situation involving multiple patients, was at odds with the personality he had developed and the personal connections he had formed since his activation. The crew initially dealt with the problem by erasing The Doctor's memory of Jetal and the entire incident, but the memories resurfaced after The Doctor discovered evidence of the surgery he had performed on Kim. A conversation with Seven of Nine then prompted Janeway to let The Doctor confront the memories and work through them himself, acknowledging that the crew could not help The Doctor become a person only to treat him as a machine when that was easier. (VOY: "Latent Image")
The crew of the USS Equinox disabled their EMH's ethical subroutines so that he would not object to killing nucleogenic lifeforms to increase the Equinox's warp power. In late 2375, with their EMH "hiding" on Voyager, the Equinox crew also disabled The Doctor's ethical subroutines, and, in his altered state, he extracted information from Seven of Nine's Borg implants. After his ethical subroutines were restored, The Doctor apologized to Seven, but she assured him that she bore him no ill will and offered to help him modify his program so that future tampering would be much more difficult. (VOY: "Equinox", "Equinox, Part II")