Automation often unfolds in unexpected ways. Machine-learning systems are what researchers call "brittle," prone to fail when encountering something that isn't well represented in their training data. These failures, called "edge cases," can have serious consequences. In 2018, an Uber self-driving test car killed a woman because, though it was programmed to avoid cyclists and pedestrians, it didn't know what to make of someone walking a bike across the street. The more AI systems are put out into the world to dispense legal advice and medical help, the more edge cases they will encounter and the more humans will be needed to sort them. Already, this has given rise to a global industry staffed by people like Joe who use their uniquely human faculties to help the machines.

Annotation remains a foundational part of making AI, but there is often a sense among engineers that it's a passing, inconvenient prerequisite to the more glamorous work of building models. You collect as much labeled data as you can get as cheaply as possible to train your model, and if it works, at least in theory, you no longer need the annotators.