The U.K.'s Russell Group of universities has released a set of guidelines for the use of AI. "It is important that all students and staff understand the opportunities, limitations and ethical issues associated with the use of these tools and can apply what they have learned as the capabilities of generative AI develop," they write. The wording could use some editing. For example, they write "while ethics codes exist, they may not be embedded within all generative AI tools," which in a descriptive sense probably means "they might not be embedded," but in a statement of principles (which this is) would carry the sense of "they must not be embedded." Similarly, when I read "Our universities will develop resources and training opportunities," I wonder whether this is meant in a predictive sense or a normative sense.