You have the right to work, but never to the fruit of the work. You should never engage in action for the sake of reward, nor should you long for inaction.
- Sri Krishna, Bhagavad Gita 2.47
PS: The first sentence is one of the top #quotes from the #gita
#Sanskrit:
कर्मण्येवाधिकारस्ते मा फलेषु कदाचन ।
मा कर्मफलहेतुर्भूर्मा ते संगोऽस्त्वकर्मणि ॥
#Kannada script:
ಕರ್ಮಣ್ಯೇವಾಧಿಕಾರಸ್ತೇ ಮಾ ಫಲೇಷು ಕದಾಚನ |
ಮಾ ಕರ್ಮಫಲಹೇತುರ್ಭೂರ್ಮಾ ತೇ ಸಂಗೋಽಸ್ತ್ವಕರ್ಮಣಿ ||
An expired licence, a doctor on duty not qualified to treat newborns, no emergency exits - these are among the several lapses that have surfaced after a massive blaze at a children's hospital in the national capital killed seven newborn babies.
#Duty #DelhiPolice #HospitalFire #MastIndia #MastodonIndians #India @mastodonindians
https://www.ndtv.com/india-news/delhi-childrens-hospital-fire-expired-licence-no-fire-extinguishers-lapses-surface-after-hospital-blaze-5751199
The #Mahabharata is considered one of the major #Sanskrit epics of ancient India, likely composed between the 4th century BCE and 4th century CE.
In contrast, the #Torah is believed to have been compiled between the 6th and 4th centuries BCE.
https://www.bhagavadgitausa.com/JUDAISM%20AND%20HINDUISM.htm?t
Title 8 and Title 18 of the US Code are already active laws requiring citizenship to vote. The Speaker of the House is failing all of us by focusing on problems solved in previous centuries instead of the problems we face today.
Call him and demand the bill supporting Ukraine get a vote: +1-202-225-4000
#AI performs
By design, #generativeAI is extremely resource-hungry: https://mas.to/@maugendre/111841507744115766
#AI is already transforming ways of working. It can even be used to industrialize killing: https://techhub.social/@estelle/111505251994013079
Tzahal preferred to use “dumb” bombs to strike the homes of the marked persons. “You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs],” said C., one of the intelligence officers.
If a junior target lived in a building with only a few floors, the army was authorized to kill him and everyone in the building with a dumb bomb.
“We usually carried out the attacks with dumb bombs, and that meant literally destroying the whole house on top of its occupants. But even if an attack is averted, you don’t care — you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”
B., the senior officer, claimed that in the current war, “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time.”
According to B., a common error occurred “if the [Hamas] target gave [his phone] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender,” B. said.
https://www.972mag.com/lavender-ai-israeli-army-gaza/ @israel @data
“The #protocol was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it,” said a source who used #Lavender.
“It has proven itself,” said B., the senior officer. “There’s something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of [bombings] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”
Another intelligence source said: “In war, there is no time to incriminate every target. So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it.”
https://www.972mag.com/lavender-ai-israeli-army-gaza/ @israel @data