2.1 The Rectangular Coordinate Systems and Graphs

x-intercept is (4, 0); y-intercept is (0, 3).

√125 = 5√5

(−5, 5/2)

2.2 Linear Equations in One Variable

x = −5

x = −3

x = 10/3

x = 1

x = −7/17. Excluded values are x = −1/2 and x = −1/3.

x = 1/3

m = −2/3

y = 4x − 3

x + 3y = 2

Horizontal line: y = 2

Parallel lines: equations are written in slope-intercept form.

y = 5x + 3

2.3 Models and Applications

C = 2.5x + 3,650

L = 37 cm, W = 18 cm

2.4 Complex Numbers

√−24 = 0 + 2i√6

(3 − 4i) − (2 + 5i) = 1 − 9i

5/2 − i

18 + i

−3 − 4i

2.5 Quadratic Equations

(x − 6)(x + 1) = 0; x = 6, x = −1

(x − 7)(x + 3) = 0, x = 7, x = −3.

(x + 5)(x − 5) = 0, x = −5, x = 5.

(3x + 2)(4x + 1) = 0, x = −2/3, x = −1/4

x = 0, x = −10, x = −1

x = 4 ± √5

x = 3 ± √22

x = −2/3, x = 1/3

2.6 Other Types of Equations

{−1}

0, 1/2, −1/2

1; extraneous solution −2/9

−2; extraneous solution −1

−1, 3/2

−3, 3, −i, i

2, 12

−1; 0 is not a solution.

2.7 Linear Inequalities and Absolute Value Inequalities

[−3, 5]

(−∞, −2) ∪ [3, ∞)

x < 1

x ≥ −5

(2, ∞)

[−3/14, ∞)

6 < x ≤ 9, or (6, 9]

(−1/8, 1/2)

|x − 2| ≤ 3

k ≤ 1 or k ≥ 7; in interval notation, this would be (−∞, 1] ∪ [7, ∞).

2.1 Section Exercises

Answers may vary. Yes. It is possible for a point to be on the x-axis or on the y-axis, and such a point is considered to NOT be in one of the quadrants.

The y -intercept is the point where the graph crosses the y -axis.
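As an illustration (a minimal SymPy sketch; the line 3x + y = 6 is a made-up example, not one of the exercises), each intercept comes from setting the other variable to zero:

    # Find the intercepts of the hypothetical line 3x + y = 6.
    from sympy import symbols, solve, Eq

    x, y = symbols("x y")
    line = Eq(3*x + y, 6)

    x_int = solve(line.subs(y, 0), x)  # set y = 0 for the x-intercept -> [2]
    y_int = solve(line.subs(x, 0), y)  # set x = 0 for the y-intercept -> [6]
    print((x_int[0], 0), (0, y_int[0]))  # (2, 0) (0, 6)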

The x-intercept is (2, 0) and the y-intercept is (0, 6).

The x-intercept is (2, 0) and the y-intercept is (0, −3).

The x-intercept is (3, 0) and the y-intercept is (0, 9/8).

y = 4 − 2x

y = (5 − 2x)/3

y = (2x − 4)/5

d = √74

d = √36 = 6

d ≈ 62.97

(3, −3/2)

(2, −1)

(0, 0)

y = 0

not collinear

A: (−3, 2), B: (1, 3), C: (4, 0)

x    y
0    2
3    3
6    4

x    y
−3   0
0    1.5
3    3

d = 8.246

d = 5

(−3, 4)

x = 0, y = −2

x = 0.75, y = 0

x = −1.667, y = 0

15 − 11.2 = 3.8 mi shorter

6.042

Midpoint of each diagonal is the same point, (2, −2). Note this is a characteristic of rectangles, but not other quadrilaterals.
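A quick numeric check of that property (a Python sketch; the rectangle's vertices are made up for illustration, not taken from the exercise):

    # The diagonals of a rectangle share the same midpoint.
    def midpoint(p, q):
        return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

    # Hypothetical rectangle, corners paired so (A, C) and (B, D) are diagonals.
    A, B, C, D = (0, 0), (4, 0), (4, -4), (0, -4)
    print(midpoint(A, C), midpoint(B, D))  # (2.0, -2.0) (2.0, -2.0)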

2.2 Section Exercises

It means they have the same slope.

The exponent of the x variable is 1. It is called a first-degree equation.

If we insert either value into the equation, it makes an expression in the equation undefined (zero in the denominator).
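A minimal SymPy sketch of this check, using a made-up rational equation (not one from the exercises):

    # Hypothetical rational equation: 3/(x + 4) = 1/x.
    # x = -4 and x = 0 are excluded: each makes a denominator zero.
    from sympy import symbols, solve, Eq

    x = symbols("x")
    eq = Eq(3/(x + 4), 1/x)

    excluded = {-4, 0}
    solutions = [s for s in solve(eq, x) if s not in excluded]
    print(solutions)  # [2]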

x = 2

x = 2/7

x = 6

x = 3

x = −14

x ≠ −4; x = −3

x ≠ 1; when we solve this we get x = 1, which is excluded, therefore NO solution

x ≠ 0; x = −5/2

y = −(4/5)x + 14/5

y = −(3/4)x + 2

y = (1/2)x + 5/2

y = −3x − 5

y = 7

y = −4

8x + 5y = 7

Perpendicular

m = −9/7

m = 3/2

m1 = −1/3, m2 = 3; perpendicular.

y = 0.245x − 45.662. Answers may vary. ymin = −50, ymax = −40

y = −2.333x + 6.667. Answers may vary. ymin = −10, ymax = 10

y = −(A/B)x + C/B

The slope for (−1, 1) to (0, 4) is 3. The slope for (−1, 1) to (2, 0) is −1/3. The slope for (2, 0) to (3, 3) is 3. The slope for (0, 4) to (3, 3) is −1/3.

Yes, they are perpendicular.
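One way to verify this (a small Python sketch of my own, using exact fractions) is to check the slope condition m1 · m2 = −1 on adjacent sides:

    from fractions import Fraction

    def slope(p, q):
        # Exact slope of the line through points p and q.
        return Fraction(q[1] - p[1], q[0] - p[0])

    m1 = slope((-1, 1), (0, 4))  # 3
    m2 = slope((-1, 1), (2, 0))  # -1/3
    print(m1 * m2 == -1)         # True: adjacent sides are perpendicular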

2.3 Section Exercises

Answers may vary. Possible answers: we should define in words what our variable is representing; we should declare the variable; a heading.

2,000 − x

v + 10

Ann: 23; Beth: 46

20 + 0.05m

90 + 40P

50,000 − x

She traveled for 2 h at 20 mi/h, or 40 miles.

$5,000 at 8% and $15,000 at 12%

B = 100 + 0.05x

R = 9

r = 4/5 or 0.8

W = (P − 2L)/2 = (58 − 2(15))/2 = 14

f = pq/(p + q) = 8(13)/(8 + 13) = 104/21

m = −5/4

h = 2A/(b1 + b2)

length = 360 ft; width = 160 ft

A = 88 in.²

h = V/(πr²)

r = √(V/(πh))

C = 12π

2.4 Section Exercises

Add the real parts together and the imaginary parts together.
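For example, Python's built-in complex type (which writes the imaginary unit as j; the operands here are arbitrary) illustrates the rule:

    # Add real parts and imaginary parts separately.
    z = (3 - 4j) + (2 + 5j)
    print(z)  # (5+1j): real parts 3 + 2 = 5, imaginary parts -4 + 5 = 1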

Possible answer: i times i equals −1, which is not imaginary.

−8 + 2i

14 + 7i

−23/29 + (15/29)i

8 − i

−11 + 4i

2 − 5i

6 + 15i

−16 + 32i

−4 − 7i

2 − (2/3)i

4 − 6i

2/5 + (11/5)i

1 + i√3

(√3/2 + (1/2)i)^6 = −1

5 − 5i

9/2 − (9/2)i

2.5 Section Exercises

It is a second-degree equation (the highest variable exponent is 2).

We want to take advantage of the zero property of multiplication: if a · b = 0, then it must follow that each factor separately offers a solution to the product being zero: a = 0 or b = 0.

One, when no linear term is present (no x term), such as x² = 16. Two, when the equation is already in the form (ax + b)² = d.
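A brief SymPy sketch of both techniques, on made-up equations:

    from sympy import symbols, solve, Eq, factor

    x = symbols("x")

    # Zero-product property: factor, then set each factor equal to zero.
    print(factor(x**2 - 5*x - 6))           # (x - 6)*(x + 1), up to factor order
    print(solve(Eq(x**2 - 5*x - 6, 0), x))  # [-1, 6]

    # Square-root property: no linear term present...
    print(solve(Eq(x**2, 16), x))           # [-4, 4]
    # ...or the equation is already in the form (ax + b)**2 = d.
    print(solve(Eq((2*x + 1)**2, 9), x))    # [-2, 1]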

x = 6, x = 3

x = −5/2, x = −1/3

x = 5, x = −5

x = −3/2, x = 3/2

x = −2, 3

x = 0, x = −3/7

x = −6, x = 6

x = 6, x = −4

x = 1, x = −2

x = −2, x = 11

z = 2/3, z = −1/2

x = (3 ± √17)/4

One rational

Two real; rational

x = (−1 ± √17)/2

x = (5 ± √13)/6

x = (−1 ± √17)/8

x ≈ 0.131 and x ≈ 2.535

x ≈ −6.7 and x ≈ 1.7

ax² + bx + c = 0
x² + (b/a)x = −c/a
x² + (b/a)x + b²/(4a²) = −c/a + b²/(4a²)
(x + b/(2a))² = (b² − 4ac)/(4a²)
x + b/(2a) = ±√((b² − 4ac)/(4a²))
x = (−b ± √(b² − 4ac))/(2a)
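As a sanity check of this derivation (a SymPy sketch; it assumes a ≠ 0), the general roots substitute back into the quadratic and simplify to zero:

    from sympy import symbols, solve, simplify

    a, b, c, x = symbols("a b c x")
    roots = solve(a*x**2 + b*x + c, x)  # the two quadratic-formula roots
    print([simplify(a*r**2 + b*r + c) for r in roots])  # [0, 0]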

x(x + 10) = 119; 7 ft. and 17 ft.

maximum at x = 70

The quadratic equation would be (100x − 0.5x²) − (60x + 300) = 300. The two values of x are 20 and 60.

2.6 Section Exercises

This is not a solution to the radical equation; it is a value obtained from squaring both sides, which changes the signs of the equation and has caused it not to be a solution of the original equation.
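A SymPy sketch of this situation, on a made-up radical equation: squaring both sides produces a candidate that fails in the original equation.

    # Hypothetical radical equation: sqrt(2x + 3) = x.
    # Squaring gives x**2 - 2x - 3 = 0, with candidates x = -1 and x = 3.
    from sympy import symbols, solve, sqrt, Eq, checksol

    x = symbols("x")
    candidates = solve(Eq(x**2 - 2*x - 3, 0), x)  # [-1, 3]
    original = Eq(sqrt(2*x + 3), x)
    print([c for c in candidates if checksol(original, x, c)])  # [3]; -1 is extraneous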

He or she is probably trying to enter negative 9, but the square root of −9 is not a real number. The negative sign is in front of the radical, so your friend should be taking the square root of 9, cubing it, and then putting the negative sign in front, resulting in −27.

A rational exponent is a fraction: the denominator of the fraction is the root or index number and the numerator is the power to which it is raised.
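A short illustration with SymPy's exact arithmetic (the numbers are arbitrary examples; the −27 case mirrors the previous answer):

    from sympy import Integer, Rational, sqrt

    print(Integer(8) ** Rational(2, 3))     # 4: cube root of 8, squared
    print(-(Integer(9) ** Rational(3, 2)))  # -27: sqrt of 9, cubed, then negated
    print(sqrt(-9))                         # 3*I: not a real number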

x = 81

x = 17

x = 8, x = 27

x = −2, 1, −1

y = 0, 3/2, −3/2

m = 1, −1

x = 2/5, ±3i

x = 32

t = 44/3

x = −2

x = 4, −4/3

x = −5/4, 7/4

x = 3, −2

x = 1, −1, 3, −3

x = 2, −2

x = 1, 5

x ≥ 0

x = 4, 6, −6, −8

2.7 Section Exercises

When we divide both sides by a negative number, the sense of the inequality sign changes.
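A quick numeric spot-check of the rule on an arbitrary example, −2x > 6, whose solution is x < −3:

    # Dividing both sides of -2x > 6 by -2 reverses the inequality: x < -3.
    # Only values below -3 satisfy the original inequality.
    for v in (-5, -3, 0):
        print(v, -2*v > 6)  # -5 True, -3 False, 0 False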

(−∞, ∞)

We start by finding the x-intercept, or where the function = 0. Once we have that point, which is (3, 0), we graph the straight line y = x − 3 to the right of it, and to the left of it we plot the positive y-values obtained by taking the absolute value.

(−∞, 3/4]

[−13/2, ∞)

(−∞, 3)

(−∞, −37/3]

All real numbers: (−∞, ∞)

(−∞, −10/3) ∪ (4, ∞)

(−∞, −4] ∪ [8, +∞)

No solution

(−5, 11)

[6, 12]

[−10, 12]

x > −6 and x > −2. Take the intersection of the two sets: x > −2, (−2, +∞)

x < −3 or x ≥ 1. Take the union of the two sets: (−∞, −3) ∪ [1, ∞)

(−∞, −1) ∪ (3, ∞)

[−11, −3]

It is never less than zero. No solution.

Where the blue line is above the orange line; the point of intersection is x = −3.

(−∞, −3)

Where the blue line is above the orange line; always. All real numbers.

(−∞, +∞)

(−1, 3)

(−∞, 4)

{x | x < 6}

{x | −3 ≤ x < 5}

(−2, 1]

(−∞, 4]

Where the blue is below the orange; always. All real numbers: (−∞, +∞).

Where the blue is below the orange; (1, 7).

x = 2, −4/5

(−7, 5]

80 ≤ T ≤ 120; 1,600 ≤ 20T ≤ 2,400

[1,600, 2,400]

Review Exercises

x-intercept: (3, 0); y-intercept: (0, −4)

y = (5/3)x + 4

√72 = 6√2

620.097

midpoint is (2, 23/2)

x    y
0    −2
3    2
6    6

x = 4

x = 12/7

y = (1/6)x + 4/3

y = (2/3)x + 6

females 17, males 56

x = −3/4 ± (i√47)/4

horizontal component −2; vertical component −1

7 + 11i

−16 − 30i

−4 − i√10

x = 7 − 3i

x = −1, −5

x = 0, 9/7

x = 10, −2

x = (−1 ± √5)/4

x = 2/5, −1/3

x = 5 ± 2√7

x = 0, 256

x = 0, ±√2

x = 11/2, −17/2

[−10/3, 2]

(−4/3, 1/5)

Where the blue is below the orange line; the point of intersection is x = 3.5.

(3.5, ∞)

Practice Test

y = (3/2)x + 2

x    y
0    2
2    5
4    8

(0, −3) and (4, 0)

(−∞, 9]

x = −15

x ≠ −4, 2; x = −5/2, 1

x = (3 ± √3)/2

(−4, 1)

y = −(5/9)x − 2/9

y = (5/2)x − 4

5/13 − (14/13)i

x = 2, −4/3

x = 1/2 ± √2/2

x = 1/2, 2, −2




  • Department for Science, Innovation & Technology
  • Department for Education

Research on public attitudes towards the use of AI in education

Published 28 August 2024


This publication is available at https://www.gov.uk/government/publications/research-on-parent-and-pupil-attitudes-towards-the-use-of-ai-in-education/research-on-public-attitudes-towards-the-use-of-ai-in-education

1. Executive Summary 

1.1 Background

The Responsible Technology Adoption Unit (RTA) within the Department for Science, Innovation and Technology (DSIT) commissioned this research in partnership with the Department for Education (DfE) to understand how parents and pupils feel about the use of AI tools in education. 

As AI tools such as large language models (LLMs) become more advanced, there are opportunities for such tools to support both teachers and pupils by creating tailored content and support, as well as streamlining processes. However, there are many questions that need to be answered before AI is implemented widely. 

1.2 Objectives 

The project sought to answer the following research questions: 

Under what circumstances, if any, are parents and pupils comfortable with AI tools being used in education? 

Under what circumstances, if any, are parents and pupils comfortable with pupils’ work being used to optimise the performance of AI tools? 

Through deliberative dialogue with parents and pupils, Thinks Insight and Strategy (Thinks) explored their concerns, hopes, and expectations, as well as the conditions for use of AI in this context. 

1.3 Methodology 

Thinks engaged a total of 108 parents and pupils across three locations in England in a mix of face-to-face and online activities. Each participant took part in four to six hours of engagement, following the structure below: 

Inform: Participants were provided with information about the purpose of the research, as well as key principles such as machine learning, data protection, intellectual property, and current and potential AI applications in education. 

Debate: Participants explored the potential social, ethical, legal, financial, and cultural issues associated with use of AI in education, and were provided with a range of views from experts and officials. 

Decide: At the end of each session, each group of participants articulated their preferred conditions for use and explored areas of consensus and difference. 

1.4 Summary of key findings 

1. Parents and pupils frequently share personal information online, often without considering the implications. The benefits and convenience of using online services, especially those that provide a tailored experience, tend to outweigh any privacy concerns. 

2. While awareness of AI among both parents and pupils was high, understanding did not run deep. AI is often associated with robots or machines, and fictional dystopian futures. Only some – those with more knowledge of or exposure to AI – thought of specific applications such as LLM-powered tools. 

3. As a result, views on the use of AI in education were initially sceptical – but there was openness to learning more. Initial concerns about AI in education were often based on a lack of understanding or imagination of how it could be used.  

4. Parents and pupils agreed that there are clear opportunities for teachers to use AI to support them in their jobs. They were largely comfortable with AI being used by teachers, though more hesitant about pupils interacting with it directly. 

5. By the end of the sessions, participants understood that pupil work and data is needed to optimise AI tools. They were more comfortable with this when data is anonymised or pseudonymised, and when there are clear rules for data sharing both with private companies and across government. 

6. The main concerns regarding AI use centred on overreliance – both by teachers and pupils. Participants were worried about the loss of key social and technical skills and reduced human contact-time leading to unintended adverse outcomes. 

7. The research showed that opinions on AI tools are not yet fixed. Parents’ and pupils’ views of and trust in AI tools fluctuated throughout the sessions, as they reacted to new information and diverging opinions. This suggests that it will be important to build trust and continue engagement with different audiences as the technology becomes more established. 

Participants agreed on some key conditions for the use of AI in education and the use of pupil work and data to optimise AI tools: 

Human oversight: Human involvement in AI use to ensure teacher roles are not displaced, to correct for error and unfair bias, and to provide safeguarding. 

Parent and pupil permissions: Providing parents and pupils with the necessary information to make informed decisions about the use of their data. 

Standardisation and regulation: Ensuring that tools introduced at schools are of a uniform standard to avoid exacerbation of inequalities, with strict oversight of tech companies providing the tools. 

Age and subject restrictions: Using AI tools only where appropriate and where they add value. Strict age restrictions on direct interaction with AI. 

Profit sharing: Ensuring that tech companies share some of their profits so that these can be reinvested into the education system and benefit schools and pupils – while recognising that private companies will need to be incentivised to develop better tools. 

While this report describes the views of the parents and pupils who participated in the research, the suggestions contained within would require further research, discussion and consultation (and use of other types of evidence) prior to translation into policy and practice. 

2. Background and methodology  

2.1 Project background

The use of AI in education has the potential to support pupils’ learning and help reduce teacher workload. But as with any new or emerging technology, there are a range of issues which need to be considered before this is implemented widely. 

The Department for Education (DfE) and the Responsible Technology Adoption Unit (RTA) within the Department for Science Innovation and Technology (DSIT) wanted to understand how parents and pupils feel about AI tools being used in education, as well as what they think about pupils’ work (e.g. schoolwork, homework, exam scripts) being used to improve AI tools.  

This research aimed to create a space for pupils and parents to learn about and discuss the issues, consider their preferences for the use of AI in education, and inform DfE’s approach to implementing AI within the education system. 

2.2 Project objectives 

The overall objectives of this project were to understand: 

In which circumstances, if any, are parents and pupils comfortable with AI tools being used in education?  

a. Which kinds of use cases are acceptable? 

b. How much human oversight do parents and pupils want to see? 

c. What concerns need to be addressed? 

d. What wider factors affect acceptability?  

In which circumstances, if any, are parents and pupils comfortable with pupils’ work being used to optimise the performance of AI tools?  

a. Should parental agreement be required? If so, would parents give permission, and under which conditions?  

b. Who should control how the work produced by pupils is used and accessed?  

c. Who, if anyone, should profit from AI which is optimised with pupils’ work? 

2.3 Methodology and sample 

Thinks Insight & Strategy (Thinks) recruited six cohorts of parents across three locations in England. Three cohorts took part in an in-person workshop, while the other three took part in online workshops: 

Parents of children with special educational needs and/or disabilities (SEND) 

Parents of children of pre-school age  

Parents of primary school pupils 

Parents of pre-GCSE pupils  

Parents of GCSE pupils 

Parents of post-GCSE pupils (aged 17-18) 

We also recruited three cohorts of pupils across the three locations for face-to-face workshops, all attending state-funded schools: 

Pre-GCSE pupils 

GCSE pupils 

Post-GCSE pupils (aged 17-18) 

Table 1 below shows the breakdown of parent and pupil cohorts across the three fieldwork locations, by mode (in-person or online). A demographic sample breakdown can be found in the Appendix. 

Table 1: Breakdown of participant cohort by fieldwork location 

In-person fieldwork 

Cohort                          Location 1   Location 2   Location 3   Total
Parents of pre-GCSE pupils      –            6            6            12
Parents of GCSE pupils          6            6            –            12
Parents of post-GCSE pupils     6            –            6            12
Total parents                   12           12           12           36
Pre-GCSE pupils                 –            6            6            12
GCSE pupils                     6            6            –            12
Post-GCSE pupils                6            –            6            12
Total children                  12           12           12           36

Online fieldwork 

Cohort                                  Location 1   Location 2   Location 3   Total
Parents of children of pre-school age   –            6            6            12
Parents of primary school pupils        6            6            –            12
Parents of pupils with SEND             6            –            6            12
Total parents                           12           12           12           36

Methodology 

In-person workshops  

We engaged a total of 36 parents/carers (referred to as “parents” throughout) and their children aged 11-18 (36 in total) in six-hour workshops. Workshops took place in three locations across England on 24 February, 25 February and 2 March 2024. In these workshops, we used the following structure: 

Inform : First, we established the purpose of the dialogue and the reason for involving parents and pupils, providing contextual information about data, foundation models and potential applications. This included showing videos from those involved in the development of AI educational tools and a participant-led demo of some educational AI products.   

Debate : After a short break, we explored the potential social, ethical, legal, financial, and cultural issues associated with use of AI in education. This included watching videos from government ministers, officials and specialists in education who explained some of the potential benefits and risks of AI in education.   

Decide : After lunch we brought together participants in their groups to compare views and explore areas of consensus, conditions for use and preferences. This involved the groups discussing different AI use case suggestions and constructing their ideal future scenario. 

Online workshops  

We engaged a further 36 parents in two online workshops, on 21 February and 28 February 2024, each lasting two hours. We followed the same deliberative research process structure divided over the two sessions. 

Inform : In the first workshop, we focused primarily on informing the participants and providing contextual information. We showed videos from those involved in the development of AI educational tools and used voting tools to interact with participants. This workshop ended by asking participants to reflect on any concerns or needs for reassurance they might have. 

Debate and Decide : In the second workshop, participants were shown videos from government ministers, officials and specialists in education who explained the benefits and risks of AI in education. Following discussion on these topics, participants ranked potential uses of data and pupil work according to levels of comfort, before offering their thoughts and recommendations. 

3. Baseline views on AI and its uses 

3.1 Summary

While awareness of AI is relatively high, understanding does not run deep. Most participants had heard of and used AI-powered tools, although not necessarily on purpose. 

With increasing use of AI, many accept it as part of modern life, but remain uneasy about the perceived invasiveness of the technology.  

However, this generally did not stop participants from using and sharing their data with services that offer an improved experience based on machine learning, such as tailored recommendations or GPS. Expressed concerns about privacy were therefore at odds with actual behaviour. 

Most parents had not previously considered the application of AI tools in education beyond the risk of pupils using LLMs to plagiarise. However, for many children, the use of technology is already a big part of their everyday lives at school, meaning they viewed this as a natural extension, or a continuation, of what is already happening. 

3.2 Awareness and understanding of AI and its use 

At the start of each workshop, participants were asked to list their first associations with the terms “AI” or “Artificial Intelligence”. Although awareness of AI as a “hot topic” was high, understanding of the technology did not run deep. Both pupils and parents were likely to associate AI with robots or machines, but also with social media, streaming and shopping platforms, apps, and websites. In particular, they thought of chatbots, targeted advertising, and algorithms recommending products. Despite some awareness, only a handful of participants across the parent and pupil samples had purposely interacted with LLM-powered tools or proactively used them regularly. When prompted with some other less obvious examples (such as GPS, AutoCorrect and predictive text) however, most discovered that they had much more exposure to AI than they had originally thought. 

Parent of primary school pupil, Newcastle: 

[An online video streaming platform] has tracked who I view and what kinds of people I have viewed and followed and brings up related videos. 

3.3 Perceptions of AI 

Most participants accepted the use of AI in various settings, products, or services, as an inevitability of modern life. However, many expressed unease about the technology, due to its perceived invasiveness both in terms of its increasing ubiquity and its reliance on personal data-sharing. Generally, participants found it easier to think of the risks of AI than benefits, even where they acknowledged that it could improve efficiency or convenience. These concerns often centred on humans being replaced by machines resulting in job displacement, but also machines not being an adequate replacement for humans because they are perceived to lack more nuanced understanding – for example, in customer service settings. 

Younger children were generally the least worried about AI, often because they had not given much thought to it, were less concerned about data security, and more used to technology playing a role in their lives. Older children, and particularly those aged 17-18, were more likely to have used AI as well as to have a general awareness of its use. Some had used LLMs and found them useful, though only to an extent, as they had quickly found limitations of the technology. Even among children and young people, some aspects of AI were seen as “creepy” or going too far, particularly AI features used by social media platforms that mimicked human conversation too closely or felt overly friendly in tone to users. 

Post-GCSE Pupil, Birmingham: 

I use [LLM-powered tool] to help with my essays; it’s quicker. 

Post-GCSE Pupil, Birmingham : 

Sometimes it asks really random questions and you think do you need to know that? 

The use of personal data in relation to AI was also a concern for both parents and children. In particular, concerns involved the sale of data to third parties by companies developing AI tools and misuse of data by other humans (for example, in the creation of deepfakes). Despite these concerns, parents and pupils reported frequently sharing their personal data online. They noted that personal information is shared to create accounts, verify their identity, and receive relevant and tailored information or experiences. They also acknowledged that the benefit and convenience of sharing this data largely outweighed their concerns. Participants noted that they had little understanding of, or gave little consideration to, what happens to their data once it has been shared, beyond a general assumption that companies store and sell it to third parties to make a profit. In part, this may be because the benefits of sharing personal data were felt to be more immediate and tangible than the risks (such as a hypothetical data breach), which can feel more abstract and far-removed as a possibility. 

I’m not actually sure what [the app] does with my data, other than checking that I’m old enough to view the videos and the content is suitable. 

Post-GCSE pupil, Newcastle: 

[What does [a video streaming service] do with your information?] Stores it, recommends you shows, brings new things in, sells your information. 

Compared with their children, parents demonstrated higher awareness of the risks of data sharing, both in relation to their own data, and that of their children. They were concerned about the information that was being put “out there”, but also felt resigned to it. A handful of parents with higher levels of knowledge of technology were excited about the opportunities offered by AI, though still wary. 

Parent of a child with SEND, Bristol: 

Helping overcome barriers is good, but thinking about, for example, language, research, literature, it might take away from that. Create an overreliance on tech and developing social skills. What would data mining implications be? Would teachers lose jobs? 

3.4 Initial views on AI in education 

Stimulus provided

Before exploring views of AI in an education context, participants were shown a video explaining what AI is and why it is important to understand and engage with it. 

In the context of limited understanding of AI, initial views of the use of AI in education were mostly sceptical. Most had not considered the use of AI in education before and found it difficult to imagine how it might be used within schools. Initially, participants were more likely to imagine pupils interacting directly with AI, rather than teachers using it to support them in their roles. Many participants immediately thought of the replacement of teachers with machines, in line with their initial concerns about human job displacement. This was rejected by participants, as they felt it was important for pupils to interact with human teachers. In addition, underlying assumptions about AI (and technology in general) making people lazy, particularly held by parents, also coloured spontaneous perceptions. 

Parent of pre-GCSE pupil, Newcastle: 

As long as the humans are not replaced, if it streamlines and allows for more personal time [with teachers], that’s got to be a benefit. 

As a result of this relatively limited prior knowledge and understanding of AI, it was initially unclear to both parents and children what the potential benefits of AI might be for teaching quality or pupil attainment. There was also uncertainty about what the use(s) of AI in various educational settings could be in practice. However, with scepticism largely grounded in a lack of experience or understanding, participants expressed an openness to hearing more. This was particularly true of pupils, many of whom felt more comfortable sharing their data and using technology relative to parents. Some pupils had already used AI in an educational context or knew that their teachers did. Even those who had not actively used AI in a school setting were familiar with the idea of existing software supporting pupils and teachers. As a result, most pupils felt that AI use in schools was already becoming the norm and further use would be a natural progression of technology application, even if they had not fully considered the implications. 

4. Using AI in education 

4.1 Summary

Both parents and pupils thought the main advantage of AI use in education was its potential to support teachers and, by extension, improve pupils’ learning experience. 

Parents, and to a lesser extent pupils, were much less certain about pupils interacting directly with AI, especially unsupervised – even though they could see benefits in AI providing highly tailored support. 

Both parents’ and pupils’ main concerns revolved around the quality of teaching, overreliance on AI resulting in loss of key social and technical skills, as well as the suitability of AI to address certain subjects and pupil needs. 

Across the board, participants were more comfortable with use cases where AI supports teachers (for example, preparing a lesson) or is used for “lower stakes” tasks (for example, marking a class test, rather than an exam).  

There was a sense that AI use should always be optional, both for teachers and pupils, and that parents should have a say in whether and how AI is used – though there was little acknowledgement of the practical issues that could arise in introducing AI on an opt-in/out basis. 

Before exploring more detailed uses of AI in education, participants were provided with stimulus in the form of demonstrations of AI tools that are currently available or in development to support learning, and videos explaining: 

Machine learning and its potential uses in education 

Current and potential benefits of AI for teachers and pupils  

Current and potential risks of AI use, including data protection, privacy, intellectual property, and bias 

The strategic benefits and risks of AI use in education from a policy perspective, and how they can be managed 

4.2 Participants’ views regarding opportunities for AI use in education 

Supporting teachers

The biggest perceived opportunity for AI use in education was to support teachers in generating classroom materials and managing feedback in more efficient ways. The perception was that this could reduce administration tasks and increase the attractiveness of teaching as a profession. 

Across the workshops, parents and pupils felt most comfortable with teachers using AI as a tool to support lesson delivery (for example, by helping to plan lessons). They were less comfortable with the idea of pupils engaging directly with AI tools, as they wanted to ensure some level of human oversight and pupil-teacher interaction. 

Pre-GCSE Pupil, Bristol: 

It can help teachers making slides, like information slides, and answer questions about stuff. 

Parent of post-GCSE Pupil, Birmingham: 

It’s less stressful for teachers to sort the homework, lesson plans… and [gives them] more time to be present and support the kids. 

Using AI as a support to teachers was felt to enable better learning experiences. 

There was a higher level of comfort with AI when it was seen as enabling teachers to redirect their time and energy into delivering high-quality education. For parents in particular, the terms “helping” and “assisting” the teacher reassured them AI would play a supporting role, rather than taking over the teacher’s role, and alleviated parents’ concerns about the risks of potential overreliance on AI (see “Lower quality of learning” in section 4.3). Interestingly, some parents and pupils assumed that the introduction of AI tools would lead to more contact time between teachers and pupils – though they were not clear on whether they would expect pupils to spend more time in school. 

I think it’s great. I’m impressed by it. I think if teachers have got that kind of tool to help with the administrative side, they have more time in the classroom for actual teaching rather than having to go home and mark and make lesson plans. 

The potential for AI tools to support teachers to provide detailed, timely, high-quality feedback was considered to be a key benefit. Parents felt that better quality feedback would help them understand their child’s progress, and identify areas where they need more support. As a result, parents were supportive of the benefits of using AI tools to help teachers to provide more frequent and personalised feedback. 

Parent of pre-school pupil, Bristol: 

It would be more targeted to my child; it would collect so much information on my child that it would support and help their learning. To show [what their] focus area [is], what subjects, might show me what might be good extra learning. 

Participants’ views on the use of AI to enhance learning experiences 

Both parents and pupils recognised that some AI tools could make learning more fun and engaging for pupils by generating visually engaging and creative resources that teachers might not currently have the time to create. During the in-person workshops, participants were encouraged to explore an LLM-powered tool using tablets and some suggested prompts. Many were impressed by the ways in which the tool could help quickly bring topics to life in the classroom, such as when assuming the character of a historical or literary figure and answering questions asked by pupils from the perspective of that character.  

Some pupils saw an opportunity for LLM-powered tools to inspire them to be more creative in their work, either by expanding on pupils’ own ideas, or by providing a starting point on which pupils could then layer their own thinking and creativity, such as when writing an essay or story.  

Using AI in these ways was felt to be exciting and engaging, bringing topics to life and helping pupils develop their own ideas. Participants, particularly pupils, expressed a more positive sentiment about AI tools creating a more interactive learning environment where they could input ideas and get interesting new feedback generated by the AI. This use of AI in education was seen by some as more acceptable than AI auto-correcting pupils’ work, or being asked an assignment question and providing answers to copy and paste. Some pupils felt more positive about AI being used interactively to gain ideas and enrich understanding, rather than inputting questions and extracting answers. 

[Future vision of AI] To generate interactive lesson plans and deliver lessons that are more engaging. 

While there was some interest in the opportunities for AI to provide personalised learning, most parents – and pupils – had concerns about the quality AI could achieve as a personal tutor. 

Across the workshops, most participants emphasised the value of one-to-one support and feedback in education but acknowledged that it can be hard to attain for some, and is dependent on teachers’ availability. AI potentially providing the same support as a one-to-one personal tutor, immediately available and tailored to pupils’ needs, was seen as a clear opportunity to improve the quality of pupils’ education. We also heard from pupils that some felt personalised AI tools could “make learning more interactive” and be able to assess and identify areas pupils might need support in. 

Parent of post-GCSE pupil, Birmingham: 

It [AI tutor] might challenge them [the child] when the class isn’t ready to go on, but they could. 

Participants recognised the potential for AI to offer more tailored and targeted support calibrated to the specific needs of individual pupils. Some pupils felt that personalised AI tools could help them improve by providing support with subjects they struggle with (such as via extension activities or summary sheets). Some parents of children with SEND saw an opportunity for AI tools to provide individualised support for their child, ranging from supporting speech or writing, tracking their progress, or even using AI as a tool for early identification of potential SEND.  

Upon closer consideration of AI providing personalised learning experiences, parents and pupils raised concerns regarding the amount of data the AI would need in order to provide personalised experiences. Parents were also concerned about pupils using AI unsupervised – which they perceived would be the case if AI was used in this way. One barrier to using AI in this way was the association that some made with unsophisticated customer service “chatbots”, which most had experienced as lacking nuance and understanding of individual situations. Despite some perceived benefits, parents of children with SEND in particular were hesitant about their child using these tools unsupervised due to concerns about unfair bias, lack of sensitivity, or access to harmful content. 

As a result, whilst many saw an opportunity for AI to fill a gap in personalised learning, parents and pupils were unconvinced that the quality of the personalised learning that AI could deliver would be sufficient. 

Parent of GCSE pupil, Birmingham: 

The potential is phenomenal, it’s like the child would have its own teaching assistant, there has to be a big buy-in from the kids, parents and teachers themselves. Thinking about the implementation though, you’re looking at farm size data storage, how is that funded, and the upkeep of that as well, that’s a big cost. 

Parent of post-GCSE pupil, Newcastle: 

It would need a lot of data about your child to support your child in each area that they’re struggling in. 

4.3 Concerns about AI use in education 

Lower quality of learning

Concerns about overreliance on AI were prevalent among participants, particularly the perception that AI could reduce quality of education and socialisation through decreased human contact hours.  

Human-to-human learning was seen as critical to providing children with a good education. We heard that there would need to be clear boundaries for the use of AI to ensure pupils benefit from social interaction with their teachers. This concern was particularly pronounced among parents of children with SEND.  

This worry also compounded an overall concern about the amount of time children spend on screens. Some parents associated AI use in education with yet another chunk of their child’s time being spent on a screen rather than having human contact. There was uncertainty about what the long-term effects of prolonged screen time might be on a child’s physical and mental wellbeing. Some parents suggested placing a time limit on the use of AI in the classroom and at home. Without this, there was felt to be a risk that, when combined with use of personal devices during their leisure time, children would never have a break from screens. 

Following the experience of the pandemic, participating pupils were particularly keen to maximise face-to-face learning experiences and were consequently less positive regarding uses of AI which could result in increased screen time to the detriment of face-to-face activities. 

GCSE pupil, Birmingham: 

I missed the social interaction of being in school [during the lockdowns implemented in response to Covid-19]. 
I feel restricted [when learning] online. 

Parent of primary school pupil, Birmingham: 

Too much screen time isn’t good for their head, it affects their sleep. 

Impact on teachers 

Related to their spontaneous concerns about AI’s potential impact on the labour market, participants worried about job losses caused by the displacement of teachers by AI. 

In participants’ initial reactions prior to guided discussion, we heard concerns about AI being used to make up for teacher shortages, effectively making human teachers redundant in the process. Participants balanced this concern against what they saw as the key opportunity: AI freeing up teachers’ time to do what they do best. 

Parent of pre-GCSE pupil, Bristol: 

What will the teacher be doing with the saved time? And how do you know the tasks being given will be relevant? 

Loss of key skills 

Both parents and pupils were concerned that the use of AI in education could result in pupils failing to develop key skills.  

In the context of overreliance on AI, there was concern that pupils could use AI to complete tasks such as maths problems or creative writing with little of their own input. There was also unanimous concern about AI leading to plagiarism. This overreliance could lead pupils to become unable to perform key skills without AI. 

GCSE pupil, Bristol: 

It feels really detrimental to use a lot of AI, because in the long-term you won’t know anything. You wouldn’t want to go to the dentist and they’ve done their homework with [LLM-powered tool] and they know nothing. 

Pre-GCSE pupil, Bristol: 

You need to be able to do it yourself and then get the feedback. 
Older kids might use it to write assignments so they’re not actually learning. Instead of researching and learning about it, they just put it into [LLM-powered tool] to get the answer. 

Some parents of children with SEND were concerned that their child could become overreliant on AI tools, particularly AI that personalised learning to their specific needs. Whilst this was seen to support them to some degree (as mentioned in section 4.2), it was also felt to risk a loss of key skills. 

As a parent, my son has dyslexia, so he has to programme in text, and the computer processes it and helps him type it. So it’d be useful for that […] But you don’t want him to rely on that. 

AI accuracy and risk of unfair bias 

Data quality – specifically whether AI could misinform pupils – was a concern for many. Some felt that AI had the potential to reinforce unfair biases.  

Throughout the workshops, many participants expressed uncertainty over whether, at its current stage of development, AI was good enough to be used in an educational context. As participants became more informed about machine learning and how it works, more participants questioned the quality of the data being used to train AI and whether there would be sufficient human oversight to quality check AI outputs.  

Expectations of where and how interactive AI tools would use data, such as marking class tests or providing feedback, were not consistent among parents and pupils. Some were concerned about AI processing and learning from incorrect answers. This was seen to be potentially damaging to the educational process if it led to pupils receiving incorrect feedback from AI. Uncertainty about how AI learns and generates information for different uses was a driver of concern for AI being used in education, where it feels more important that data is accurate than in other settings. As a result, parents and pupils felt it was imperative that AI tools were carefully assured, and that appropriate training was provided, before AI is used widely in schools. 

Inaccurate information being fed through the software could be really concerning. 

After showing participants a video about machine learning and an animation about bias (see Appendix), some expressed concern about the potential for AI to reinforce harmful biases and reproduce inaccurate information. This raised questions about how quickly AI can “unlearn” biases and how these unfair biases would be picked up. Unfair bias in AI was perceived as a potential risk; however, many parents acknowledged that this risk already exists in humans. The majority of participants wanted reassurance that AI was going to be monitored by a human to ensure the information given to pupils was accurate. 

The fact that AI is always learning, and it learns from the data the kids are putting in, so if they aren’t getting it right, it could take it off course. 

Harmful content 

Lack of safeguarding and the risk of encountering harmful content when pupils interact directly with AI were concerns for parents.  

We heard concerns, particularly from parents of younger pupils, about children being exposed to harmful content at school when using AI, as it didn’t feel clear whether there would be robust safeguards in place. This built on an existing worry about how children interact with technology and what they are exposed to online. Some parents therefore suggested they would want to limit this risk where possible by reducing unsupervised technology use, rather than introducing a further opportunity for their child to encounter harmful content. At the same time, many parents felt that they already had little control over their child’s consumption of online content, and educational tools were likely to be safer than unregulated access to the internet.   

Like on [social media platform], and it learns from what you’re watching, if you’re watching suicidal content it’ll keep showing you suicidal content. 

Parent of primary school pupil with SEND, Birmingham: 

She’s already talking to [voice assistant] all the time, it’s a different world for them. 

Clarification on whether pupils could be exposed to harmful content through their use of AI, and the steps to prevent this, was essential for all participants – but particularly parents. We consistently heard that parents would like a clear understanding of how AI will be used by their child and reassurance that steps are in place to protect their child from any harms. Additionally, both parents and pupils mentioned that they would expect there to be systems in place that would flag if a pupil was trying to access harmful content, or asked questions or mentioned real events in their personal or school lives that suggest a safeguarding issue. 

Overall, most parents felt more comfortable with their child using AI in schools with supervision from a teacher or member of staff. If it was to be used at home, some said they would want to oversee use. This was particularly important for parents of pre-school and primary school pupils, who were at times worried about whether there would be any security controls to prevent pupils accidentally seeing harmful content. 

Unequal access to AI 

Parents and pupils were concerned that AI use would exacerbate existing inequalities in society. 

Almost all participants felt that if AI could indeed support children’s learning and potentially give them a head start, there should be equal access to it for all schools. Within the current education system, they assumed that the best AI tools would only be accessible to the schools that could afford it. They felt this would exacerbate existing inequalities, add to the unfair advantage of those who are better off, and lead to further stratification – of the education system, but also of the labour market and society as a whole. Parents of pupils who attend schools that are struggling or in disadvantaged areas felt resigned to inequality getting worse, with AI tools just another resource their child could miss out on. 

There was also some concern about variation in teachers’ abilities to use AI to its full potential, at least at first. Both parents and pupils worried that if training and support wasn’t provided to ensure all teachers meet a minimum level of proficiency with AI tools, some pupils would benefit less from AI use than others. 

As a result, many felt that the introduction of AI tools in schools should be centrally coordinated and funded, with tools standardised and quality assured, and profits from selling pupils’ work and data reinvested into the school system. 

What about schools that don’t have the facilities? It was hard enough before all this. 
It will just make the wealth divide worse. 
Poor and working class [areas] might not have access to computers, affluent areas will have the best access. 

Data assurances 

In order to give permission for their child’s data to be used, parents need more clarity and reassurance about how data will be collected, stored, used and shared.  

Concerns about privacy and data breaches were prevalent among parents, many of whom had questions about how and where their child’s data will be stored and shared. They were also concerned about the potential longevity of data, and the extent to which it could “follow their child through life” and affect their employment and further education opportunities. There were also concerns about potential data sharing between government departments. Parents of pupils with SEND in particular were concerned that the data could affect their child’s eligibility for state-funded benefits. 

Where does it go, where does it stop? Will it always be tagged to you? What about applying to university? 

Given these concerns, the majority of participants wanted to see data protection rules adhered to, and reassurances that data generated from pupils’ interactions with AI would not be used for wider, non-education related purposes. Alongside this, they needed clear information about why data is being collected, who will have access and how long it will be stored. For any use of AI in education, pupils’ personal data being accessed or hacked was a key concern which led to some participants feeling uncomfortable with pupil data being used to train educational AI tools. 

There is a sense of big brother about it all. Infant school, they’ve got your whole life in a data bank, how is that information going to be utilised. 

4.4 Acceptable and unacceptable use cases 

Table of AI use case acceptability

Acceptable

Acceptable uses of AI were felt to be those that help rather than replace teachers: 

Creating a lesson plan 

Generating class tests 

Generating class materials 

AI was also felt to be acceptable if being used by teachers as a tool to provide additional academic support: 

Generating feedback on pupils’ work 

Marking classwork 

Marking class tests 

Participants, especially parents, were hesitant to say AI was acceptable to personalise learning: 

Helping teachers decide what support a pupil needs 

Personal tutor chatbot for pupils 

There was some positive sentiment towards personalised learning and the potential benefits to the quality of education. When it was considered acceptable, specific conditions were required: 

The personalised AI tool is monitored and ‘signed off’ by a teacher 

Clear information is provided about what pupil data will be used and how it will be stored 

Parents’ permission is obtained before personalised AI tools are used 

Pseudonymised or anonymised data to be used, with robust data protection. 

Unacceptable 

Use cases were felt to be less acceptable where AI error could negatively impact educational outcomes (and therefore children’s future prospects), for example by getting an exam mark wrong: 

  • Marking exams 

5. Using pupil work and data to optimise AI tools 

5.1 Summary 

Parents and pupils were generally comfortable with pupil work being used to optimise AI tools, with very few concerns about intellectual property. 

However, there was much more uncertainty about work being personally identifiable and personal data being shared outside of schools and DfE.  

Both parents and pupils needed reassurance about the de-identification or anonymisation of data, especially special category data, which was seen as requiring more protection, and data linked to other records, such as patient records (for example, for children with SEND). 

Although neither parents nor pupils thought that they should be directly compensated for providing their work or data to tech companies, they strongly felt that private companies should be required to share at least some of the profit with schools (via DfE). 

After receiving an explanation of machine learning, participants were provided with examples of different forms of pupil work (such as homework, class work, mock exams, exams) and data (such as name, age, SEND status) that could be used to optimise AI tools. 

5.2 Pupil work 


Parents and pupils were comfortable with pupil work being used for AI tool development in the vast majority of cases.  

Most participants understood that greater breadth and volume of data provided to optimise AI tools results in AI tools having a greater understanding of what constitutes ‘good’ and ‘bad’ work, and being able to provide constructive feedback. Most grasped the need for AI tools to be optimised with work spanning higher to lower grades, and some specifically pointed out that without examples of ‘bad’ work and the ability to identify what makes work stronger or poorer, AI tools would not be able to assess work as needed.  
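
The intuition participants arrived at can be illustrated with a deliberately tiny sketch. The Python snippet below trains a toy “marking” model on a handful of invented essays and grades; the examples, the grading scale and the model choice (TF-IDF features with ridge regression) are all hypothetical, and real educational AI tools are far more sophisticated. The principle is the same, though: a model can only learn what separates stronger from weaker work if both appear in its training data.

    # Toy sketch only: all essays and grades below are invented.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import Ridge
    from sklearn.pipeline import make_pipeline

    # Hypothetical training examples deliberately spanning high and low grades.
    essays = [
        "The evidence is weighed carefully and the conclusion follows logically.",
        "A clear argument supported by two well-chosen quotations.",
        "Some relevant points, but the structure is hard to follow.",
        "Very short answer with no supporting evidence.",
    ]
    grades = [9.0, 8.0, 5.0, 2.0]  # e.g. a GCSE-style 1-9 scale

    # Fit a simple text-to-grade regression model.
    model = make_pipeline(TfidfVectorizer(), Ridge())
    model.fit(essays, grades)

    # The fitted model can now estimate a grade for unseen work.
    print(model.predict(["A structured argument with supporting evidence."]))

If the low-graded examples were removed, the model would have nothing to contrast “good” work against, which is exactly the point participants made about needing work that spans the full range of quality.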

In particular, participants felt that AI tools would need to be optimised with as many different styles of work as possible, in order to fairly and accurately assess and support pupils with differing abilities and needs, especially children with SEND. They noted the particular importance of this in more subjective cases, such as in creative writing. 

For me it would be that what is put into the system is enough to get a positive outcome for the children. 

Although there was confusion about how exactly AI tools would learn from pupils’ work, parents and pupils still felt pupil work was fine to share. By the end of the engagement, both parents and pupils understood that providing a wide range and quality of work would improve AI outcomes in the long run. As a result, they accepted data sharing as a necessity. 

Concerns about the use of pupil work to optimise AI tools 

While most types of work were considered fine to use, usage needs to be clearly communicated to avoid concerns about plagiarism or penalisation. 

The foremost concern about sharing work with AI tools was that more substantial pieces of work (such as coursework) could be plagiarised by other pupils. The first assumption of parents, and particularly of pupils, was that AI tools could be used by other pupils to generate work that draws heavily on their own, leading to their efforts being co-opted. Some understood AI ‘learning’ from pupils’ work to mean that AI would then use it to create new pieces of work for other pupils. 

Post-GCSE pupil, Birmingham: 

Not okay to share [Homework] – because your schoolwork is your intellectual property, it’s you and you did that. If the AI takes that then you can’t copyright it. 
It can’t use everyone’s homework so it can be copied and plagiarised. 

Even so, this concern was only notable for larger pieces of work that pupils had spent considerable time on; there was little concern about more routine work produced by pupils (such as class test answers).  

There was also concern from some about pupil work being shared more widely by AI tools, with pupils in particular worrying that this would mean that examples of ‘bad’ work they produced would be circulated among or accessible to other people and cause embarrassment or judgement.  

Further explanation of how work would be used to optimise AI tools, rather than being regurgitated or circulated, provided reassurance to uncertain pupils and parents. Emphasis on the volume of data required to optimise AI tools, and reiterating that an individual piece of work would be one among millions of pieces of pupil work, also reassured some parents and pupils.  

Additionally, some parents noted that examples of high-scoring essays or exam answers were already shared more widely, and did not feel sharing work with AI tools would be cause for more concern.  

However, pupils and parents maintained some doubts about the limitations of AI optimisation, especially in relation to more creative or subjective pieces of work. 

Some parents and pupils were unconvinced by the ability of AI tools to assess work for subjective subjects requiring more nuanced interpretation such as PSHE, or creative subjects like Art and English. They did not feel that pupil work would optimise tools to the extent needed for them to achieve a human level of expertise and understanding, making the use of pupil work feel futile. 

I think it makes sense with the factual subjects, because with science and maths most of the time there is a definitive answer. But like English there is a main answer but there are other right answers too. 

Concerns about plagiarism were also heightened for creative work such as artwork or longer essays, which pupils felt was more obviously valuable intellectual property and could hold more personal significance than routine written work. As above, they struggled to understand how AI tools could be optimised using this work, or to believe that a sufficient level of optimisation could be achieved. 

It wasn’t very clear about the copyright situation, I think that’s a huge thing to know, for all children, some children have been designing logos and stuff from like 13/14. 

Acceptability of the use of different types of pupil work 

Acceptable pieces of work were those felt to be less ‘valuable’, with fewer concerns about them being plagiarised or misinterpreted by the AI: 

Participants were less sure about the use of work that more effort had gone into or that felt more subjective or creative: 

Coursework 

There was more reluctance about the use of more ‘serious’ pieces of work with higher stakes, and more reassurance needed for their use: 

Mock exams 

Exam answers 

5.3 Types of data 


Parents and pupils were most comfortable with anonymised demographic data being used and shared. 

In almost all cases, participants were comfortable with anonymous demographic data being used to optimise AI tools. They particularly recognised the importance of providing AI tools with information on pupils’ ages or year groups, in order to accurately gauge the progress and performance of pupils at this level. 

While there was some confusion about the need for data like gender, most participants were nevertheless fine with it being provided as it was not a threat to pupils’ anonymity. A few parents expressed concern that this data could contribute to unfair bias or discrimination, and some parents and pupils stressed the need for data about gender in particular to be inclusive, reflecting pupils’ own gender identities rather than erasing them. 

You’ve got bias in AI but its already there, probably easier to correct than it is in a person. 

More conditions were attached to the use of pseudonymised and special category data, which were seen as requiring more protection, despite recognition of their necessity and openness to their use. 

[On including gender] It depends what it’s being used to train it for. It doesn’t really bother me but bias can happen. 

Parents and pupils understood that in order for AI tools to provide personalised, lifelong support for pupils that is tailored to their educational needs and learning styles, data linkage is necessary via pupil identifiers. There was openness to this due to the potential benefits for pupils and the perception that this tailored support would lead to better outcomes than generic AI use.  

However, participants were deeply concerned about the security of this data, especially special category data, fearing that any breaches would result in comprehensive datasets about individual pupils’ demographics, abilities, and weaknesses being shared more widely and exploited. This was a particular worry for parents of children with SEND, for whom concerns centred around their children’s future opportunities. They were particularly concerned that their child’s SEND status could be shared between government departments which could impact the benefits their child might be entitled to, or about future employers accessing their child’s data via the companies developing AI, impacting their child’s future.  

Both parents and pupils strongly felt that if data is pseudonymised, identifiers should be held at school level and not shared with tech companies or the government. There should also be stringent restrictions and safeguards in place to ensure the security of this data, with assurances communicated to parents and pupils about how the data is stored, who has access to it, and when and where it will be used. 

Data should only be shared with schools, parents and education department. 

Parents and pupils felt strongly that personally identifiable data should not be used in any circumstance. 

Participants emphasised that data that allows individual pupils to be identified, such as name or date of birth, should not be used. This data was seen as unnecessary for AI optimisation in an educational context, and was deemed to carry too many risks for pupils when linked with the other data being collected, particularly special category data. While parents were more resistant to the use of this data, citing the concerns about future opportunities covered above, pupils also strongly preferred the use of their data in an anonymised or pseudonymised form. 
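
As a rough illustration of the arrangement parents and pupils described, the Python sketch below keeps the lookup table linking real identities to pseudonyms inside the school, and strips direct identifiers such as name and date of birth before a record is shared. The field names are invented; this is a minimal sketch of the principle, not a real data protection implementation.

    import secrets

    class SchoolPseudonymiser:
        """Keeps the identity-to-pseudonym mapping at school level only."""

        def __init__(self):
            self._lookup = {}  # real identity -> stable pseudonym (never exported)

        def pseudonym_for(self, pupil_name: str) -> str:
            if pupil_name not in self._lookup:
                self._lookup[pupil_name] = secrets.token_hex(8)
            return self._lookup[pupil_name]

        def export_record(self, pupil_name: str, record: dict) -> dict:
            # Strip direct identifiers and attach the pseudonym instead.
            shared = {k: v for k, v in record.items()
                      if k not in {"name", "date_of_birth"}}
            shared["pupil_id"] = self.pseudonym_for(pupil_name)
            return shared

    pseudonymiser = SchoolPseudonymiser()
    print(pseudonymiser.export_record("Alex Smith", {
        "name": "Alex Smith", "date_of_birth": "2011-04-02",
        "year_group": 8, "essay_grade": 7,
    }))

Under this arrangement, only the school can re-identify a pupil from a shared record, which matches participants’ expectation that identifiers should not leave the school.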

Acceptability of the use of different types of pupil data 

Use of data that could easily be anonymised and was felt to be relevant to AI understanding of pupils’ work was widely accepted: 

Age or year group 

Gender 

Assurances were needed about data perceived as more sensitive or pseudonymised, particularly to address concerns about data security and storage: 

Pupil identifier 

Information about SEND (or any health conditions) 

Data identifying pupils was unacceptable and felt to be unnecessary: 

Name 

Date of birth 

5.4 Control of pupil work and data 

Parents and pupils 

All participants expected to be involved in decisions made around the use of pupils’ work and data, with parents and pupils having the final say. 

While parents and pupils didn’t expect to make specific decisions about AI optimisation, they did expect to be consulted on whether and by whom pupil work and data can be used. There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.  

Pupils also felt that knowing how their work and data would be used would be important, and that they should have a say alongside their parents, especially if they were old enough (see 6.3 Parent and pupil agreement for use of work and data for further discussion of the age at which pupils should have a say). However, they were less likely to require extensive detail about its intended use, reflecting their higher level of comfort with data sharing and acceptance of its necessity in order to benefit from the tools using it. With the understanding that pupil work is their intellectual property, some pupils were more concerned about the use of their work than their data (see 5.2 for concerns about work use). 

If child’s work is going to be used/processed in AI the parents should be advised. 

Schools were most trusted to make decisions about the use of pupils’ work and data, as well as to hold data that was seen as more sensitive (such as SEND data or pupil identifiers). Where concerns about school involvement existed, they centred on inequitable AI use and access. 

Parents and pupils felt that schools could be relied on to make decisions in the best interest of pupils and to prioritise educational outcomes and safety over other considerations like AI development and profit. Central to this trust was the widely held perception that schools are not primarily profit-motivated and are already trusted to safeguard pupils, which led to the assumption that schools can be relied upon to continue doing this when it comes to AI. As a result, participants wanted schools to have the final say in how pupil work and data is used, with the ability to approve or reject uses suggested by the government or tech companies if they are felt to harm pupils or jeopardise their privacy and safety.  

Schools were also trusted to hold pupil data, with many who were uncertain about special category data being shared and used feeling reassured about this data being collected if schools could control its use and guarantee that it would not be shared beyond the school. 

The ID number has to stay within the school and be really safe. 
I would want to feel the school (teachers especially) have all the info and are confident the AI is safe. 

A few parents noted that schools may not all choose to use AI, or that there could be disparities within schools if it were left up to teachers’ discretion and some refused to integrate AI into their teaching. Some worried that schools with fewer resources would be left behind as other schools (such as private schools) adopted AI use to their advantage. There was also a minor question about the impact schools’ teaching philosophies might have on the decision to use AI or not, for example whether religious schools might choose not to use a standardised AI tool in order to have control over what exactly students learn.  

However, there was little real concern about schools’ oversight of AI tools and pupil work or data, with most participants feeling the more control schools have, the better. 

Parent of GCSE pupil, Bristol: 

What about schools that don’t have the facilities? It was hard enough before all this.  
Access is a concern, ensure there’s a level playing field across the board.  

Government 

Parents and pupils saw a role for DfE in setting rules around AI use and (to a lesser extent) pupil work and data, recognising the need for a central authority. However, many participants were worried about potential negative impacts from the use of AI tools and pupil work and data by government.  

Most felt DfE should have a say in how AI is used in schools, believing that central guidelines would make AI use more consistent. DfE was also generally trusted to make decisions in the best interest of pupils and with education rather than profit in mind. This was seen to necessitate its involvement in any decisions made by tech companies. 

However, trust in DfE to set rules was predicated on school involvement in decisions made, particularly those around the use of pupil work and data to optimise AI tools. While there was a need for DfE to provide central oversight, parents and pupils were still hesitant to hand over complete control of pupil data. In this, participants’ preferences reflected pre-established views that schools, being closer to pupils and in close communication with them and their parents, were more familiar with pupils’ needs and parents’ concerns, and were therefore more likely to make decisions accordingly. 

Pupil-centric at every stage, profits should be distributed to the schools and [for] development not just led by tech companies, with the education [sector] as well. 

There was a notable tension between the desire and perceived need for robust government oversight, and concern around government involvement. Many parents and pupils worried that other government departments might not make decisions in the best interest of pupils, or might not have the ability to direct efficient, effective, and beneficial use of AI. 

My initial thought is an independent regulatory body so they’re a step away from it but I don’t know what that looks like. 

Parents also worried about how pupils’ performance and special category data (such as SEND status) could be used by government if held in a central database accessible beyond DfE. There were also concerns around how particular agendas might determine the content used to optimise AI and therefore how and what AI tools teach pupils.  

This was a particular concern for parents of children with SEND, who worried that their children’s future could be affected if pseudonymised or personally identifiable data is held and accessed by government beyond their time at school. They required reassurance that data showing their children’s level of ability and any SEND would not be used in future, for example to affect their entitlement to government assistance.  

Many parents also worried about increased surveillance if government were provided with data on children throughout their formative years, particularly if AI use becomes standard and most or all of the population’s data in this context is held and used by a limited number of central organisations. 

Thinking about the work…How long will it be kept there - who will it be shared with and how much of my child’s personal info is attached to it? 

Participants feared that particular viewpoints or biases, including those within the curriculum, could become more entrenched in AI and harder to correct. For these participants, involvement of independent experts within the field of AI and education could mitigate some of this risk by providing a check for decisions and ensuring a balance of views. 

I feel like they’re trying to push the kids in a certain direction, and then the government gets to know everything [decision] they make. 

Tech companies 

Trust in tech companies was extremely limited and there was little to no support for them to be granted control over AI and pupil work and data use. 

Profit was almost universally assumed to be the primary or sole motivation of tech companies, rather than the desire to improve education and pupil outcomes. Reflecting starting views of tech companies as non-transparent and assumptions that data is sold on to third parties, participants did not trust them to protect or use data responsibly. Parents and pupils assumed that given free rein and with no oversight, tech companies would choose to sell data on to other companies with little concern for pupil privacy or wellbeing. 

I think yes, the company is going to benefit, that’s economics, but I think it would be good to give it back to schools. 
Yeah, you kind of want to know what type of people are developing [it], if the people running it are doing it for the wrong reasons, it could get out of hand, you want to know they’re doing it for the right reasons. 

Participants did note that tech companies working in close partnership with schools or DfE, with clear oversight and regulation, would be more likely to use pupil work and data responsibly and to benefit pupils. 

6. Conditions for use 

6.1 Summary 

Participants identified the following conditions for the use of AI in education and the use of pupils’ work and data to optimise AI tools: 

Human oversight: Human involvement in AI use to correct for error and unfair bias, as well as providing safeguarding. 

Parent and pupil permissions: Providing parents and pupils with the necessary information and the opportunity to make informed decisions about the use of their data. 

Standardisation and regulation: Ensuring that AI tools used within schools are of a uniform standard to avoid exacerbation of inequalities, with strict oversight of any tech companies providing the tools. 

Age and subject restrictions: Using AI tools only where appropriate and where they add value. Strict age restrictions on direct interaction with AI.  

Profit sharing: Ensuring that tech companies that benefit from accessing data share some of their profits so that this can be reinvested into the education system and benefit schools and pupils – while recognising that private companies will need to be incentivised to develop better tools. 

6.2 Human oversight 

Participants stressed the importance of human involvement in AI use at every step of the process. 

Given the recent developments in AI, and the need to continue to optimise it, the use of any tools in the classroom or at home was seen as risky if not overseen by humans, at least to begin with. This concern was particularly pronounced after participants heard about the risks of bias and about AI only being as good as the data it learns from. Many noted that AI can make mistakes or ‘hallucinate’ inaccurate responses, and would need humans to ensure nothing was being taught or assessed incorrectly. There was also an assumption that errors made by AI would be harder to correct than those made by a teacher, which can often be addressed directly by parents or pupils in conversation. This means AI tools should always be checked, with any resources created looked over by teachers, any marking or feedback generated by AI tools reviewed by teachers, and any tests or exams marked by AI being assessed by teachers or external markers.  
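
A minimal sketch of what such a check might look like in practice follows, assuming a hypothetical review-queue structure in Python; the names and workflow are illustrative, not a description of any real tool. The key property is that no AI-suggested mark is released until a teacher has approved (and, where needed, corrected) it.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class MarkSuggestion:
        pupil_id: str
        ai_mark: int
        approved: bool = False
        final_mark: Optional[int] = None

    @dataclass
    class ReviewQueue:
        pending: list = field(default_factory=list)
        released: list = field(default_factory=list)

        def submit(self, suggestion: MarkSuggestion) -> None:
            self.pending.append(suggestion)  # nothing is released automatically

        def teacher_review(self, suggestion: MarkSuggestion, final_mark: int) -> None:
            # The teacher's decision overrides the AI output where they differ.
            suggestion.approved = True
            suggestion.final_mark = final_mark
            self.pending.remove(suggestion)
            self.released.append(suggestion)

    queue = ReviewQueue()
    suggestion = MarkSuggestion(pupil_id="a1b2", ai_mark=6)
    queue.submit(suggestion)
    queue.teacher_review(suggestion, final_mark=7)  # teacher corrects an AI error
    print(queue.released)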

Parents were particularly keen that pupils’ AI use is supervised or at least controlled, and that AI tools are never used as a substitute for a teacher. Pupils similarly stressed that learning should not be solely delivered by an AI tool operating independently, as teacher-pupil interaction is highly valued and most felt some level of human subjectivity is always needed. Pupils also worried that AI use without human oversight might mean errors made by AI are overlooked, leading to them not learning the skills they need or being taught incorrectly. Any potential errors should and could be picked up by earlier human assessment of AI outputs. 

Parent of Pre-GCSE pupil, Newcastle: 

The [AI] tool should supplement the teacher, not replace or undermine [the teacher]. A pupil-teacher relationship is still very important for [the pupil’s] development. 

6.3 Parent and pupil agreement for use of work and data 

Both parents and pupils felt they should be enabled to make free and informed decisions about how pupil work and data is used. 

This means having an understanding of when AI tools will be used and why, and how pupil work and data will be used to optimise them and why. Almost all participants felt that agreement should be a pre-condition of AI use. 

Despite consensus that agreement should be required, views around the details of agreement differed: 

Parents emphasised their responsibility to make informed decisions for their children’s wellbeing. They therefore felt their permission ought to be required, particularly for younger pupils (generally those aged under 16). Many were resistant to the idea that their children could make these decisions for themselves, wanting to have a say in all aspects of their children’s education.  

Pupils tended to attach more importance to their own comfort with AI and work and data use, particularly with the understanding that the work they create is their intellectual property. Most pupils we spoke to had experience of permitting data sharing for themselves when signing up to and/or using apps and websites, and most did not view agreeing to work and data use for AI optimisation purposes any differently. While many were happy for their parents to also have a say, some felt this should not supersede their own wishes, and that pupils should have final say over the use of their work and data above a certain age (13 or 16). 

Parent of GCSE pupil, Birmingham:  

Up to 16, it’s definitely a parental choice, but as they start to make their own choices this would be included. 
Might be good to trial with older kids, because we can already consent ourselves and then you could show the parents the positive data. 

Expectations for how permission would be provided varied, but most parents described an “opt-in” model and expected to be given the chance to understand and agree to all potential uses of their child’s data and work. Parents suggested that this agreement could be “staggered” as understanding of AI tools and comfort with its use grows, and that schools and DfE could make decisions about AI use within the parameters of permission provided. Generally, the expectation was that even completely anonymised data and work would require some level of permission to be shared and/or used, though most participants indicated they would agree to its use. However, there was little consideration of how this would work in practice, especially alongside equitable access to AI for all pupils and schools, which was seen as an important condition for its use.   
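
One way to picture the “opt-in” model parents described is as a permission record where nothing is allowed by default and agreement is recorded purpose by purpose, so it can be extended (“staggered”) over time. The Python sketch below is illustrative only; the purposes and field names are invented.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class ConsentRecord:
        pupil_id: str
        granted: dict = field(default_factory=dict)  # purpose -> date agreed

        def opt_in(self, purpose: str) -> None:
            self.granted[purpose] = date.today()

        def allows(self, purpose: str) -> bool:
            # Default is "no": any use not explicitly opted into is refused.
            return purpose in self.granted

    record = ConsentRecord(pupil_id="a1b2")
    record.opt_in("anonymised work used to optimise marking tools")
    print(record.allows("anonymised work used to optimise marking tools"))  # True
    print(record.allows("data shared with third parties"))                  # False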

Generally, pupils expressed higher levels of comfort with sharing their data than parents, many of whom had serious concerns about data privacy, security and storage. A few pupils assumed their parents would lack understanding and would be reluctant to allow them to share their data as a result, in contrast to their own willingness to share it. Many parents noted that widespread AI use and normalisation of data-sharing would make them feel more positively about it and more likely to easily provide permission, assuming that once AI use becomes “tried and tested”, concerns are likely to be alleviated. 

6.4 Standardisation and regulation 

AI use in schools should only be through standardised and strictly regulated tools to ensure quality control and equity of access. 

Parents and pupils stressed that all schools should have access to the same, quality assured, AI tools. Many suggested this could be provided by certification processes sanctioned by schools and the government, with only AI tools that are officially tested and meet a minimum performance standard being approved for use in education. For many, this would alleviate concerns about some pupils or schools benefitting over others by accessing more developed AI tools. 

Concerns about the quality of AI tools also led to worries that pupils could be penalised for, or disadvantaged by, poor teaching or support provided by low-performing AI tools. Pupils worried that they would be held accountable for any errors committed as a result of incorrect AI teaching or support. Parents also wanted guarantees that, in cases where low-performing AI tools led to poor pupil performance, the pupil would not be penalised, and emphasised a need for regulations ensuring clear accountability in case of AI error or misuse. In particular, parents of primary and pre-school children wanted guarantees of accountability in the case of malicious or inappropriate content being propagated by AI tools, along with strong and appropriate content safeguards to ensure they are safe for children to use. 

Parent of Post-GCSE pupil, Newcastle: 

If used in marking exams, make sure its accurate so pupils are not disadvantaged. 

While there was no overall consensus on who ultimately could be held accountable for any issues that arise, many suggested DfE and schools both have a responsibility to ensure AI tools are fit for use, and to minimise and rectify any errors or misuse. Others felt that this responsibility should lie with tech companies, and that as the developers of these tools, they should be made to answer if their use harms pupils. 

Regulation was also felt to be crucial for ensuring stringent standards for data collection, privacy, and security. 

DfE and the wider government were generally seen as responsible for setting, communicating, and maintaining these standards. Parents in particular expected clear rules to be established for the following (see the sketch after this list for one way such rules could be made explicit and checkable):  

How pupil data can be collected; 

For what purpose it can be collected; 

How it will be stored; 

How long it will be stored for; and  

Who can access it.  
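
As a rough illustration of how the rules above could be made explicit and checkable, the sketch below encodes them as a hypothetical policy object in Python; the data items, purposes, retention periods and roles are all invented for the example.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class DataPolicy:
        data_item: str
        purpose: str             # why the data is collected
        retention_days: int      # how long it may be stored
        access_roles: tuple      # who may access it

    # Invented example policies.
    POLICIES = [
        DataPolicy("year_group", "calibrate feedback to age and stage", 365,
                   ("school", "dfe")),
        DataPolicy("essay_text", "optimise marking and feedback tools", 730,
                   ("school", "approved_ai_vendor")),
    ]

    def access_allowed(policy: DataPolicy, role: str, age_days: int) -> bool:
        # Access requires both an approved role and data still within retention.
        return role in policy.access_roles and age_days <= policy.retention_days

    print(access_allowed(POLICIES[0], "dfe", age_days=100))         # True
    print(access_allowed(POLICIES[1], "third_party", age_days=10))  # False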

Parents emphasised the importance of these regulations being put in place and communicated as a pre-condition for widespread AI use in education. 

6.5 Age and subject restrictions 

Parents and pupils were in agreement that the use of AI tools should be restricted, with the most accepted uses involving older pupils and subjects seen as “objective”. 

There was a general consensus that AI tools would be best used directly by pupils in secondary education, at which point both parents and pupils felt that pupils would be able to confidently and safely interact with the technology. There was less concern about pupils not developing necessary social skills at this point (due to interacting with AI tools alongside teachers), and less concern about the use of pupils’ data and work. Overall, both parents and pupils felt most comfortable with AI tools being directly used by pupils old enough to understand the tools and agree to their use. Parents’ estimation of this age tended to be higher than pupils’: pupils were more likely to set the minimum age at 11 or 13, while many parents felt that pupils would only be able to meaningfully agree at age 16. 

GCSE pupil, Birmingham:  

Maybe it’s not appropriate for young kids, you should have restricted access, and it might not simplify it enough. 

Parents of primary and pre-school pupils were least comfortable with the potential use of AI tools, citing concerns around unintentional exposure to harmful content and children not picking up the skills they need to develop. At this age, the importance of play and socialisation was emphasised, and parents worried these elements of young children’s day-to-day education would be lost or minimised through reliance on AI. 

Both parents and pupils were most comfortable with AI being used to support learning (and particularly to mark work and/or provide feedback) in subjects seen to have more concrete, and therefore more easily assessed, answers, such as Science or Maths. These subjects, where answers are often closed or clear-cut (for example, multiple choice), were seen as less likely to confuse AI tools or to be assessed incorrectly due to bias or a lack of understanding. Participants broadly felt reassured that AI tools could be sufficiently optimised to correctly assess these forms of work and would trust their use when overseen by a teacher. 

There was considerably less openness to AI being used to support marking or to assess more creative or subjective subjects like Art, English, Religious Studies or Social Studies. Participants deeply doubted that AI could engage with pupils’ schoolwork in these subjects in the same way as a human, or grasp their nuances as a teacher would. They also broadly felt that these forms of schoolwork are more personal to pupils, or involve more effort to create, making the stakes of any AI error feel higher. 

Parent of primary school pupil, Bristol: 

You lose being creative, the students being creative, relying on an AI to educate them, and then using AI to do their homework, they’re going to lose that creativity. 

6.6 Profit sharing 

There was widespread consensus that, if profit were to be generated through the use of pupils’ work to enhance AI in education, schools should be the beneficiaries, and there was resistance to the idea of tech companies being the sole profiteers. 

Generally, parents and pupils acknowledged that pupils profiting individually from the use of their work and data would not be feasible, but almost all strongly believed that any profits derived from this data use should be distributed among schools so that pupils benefit. This belief was intensified by the understanding of intellectual property and pupils’ ownership of their work and data. Participants suggested a minimum share of the profits being handed back to schools, but views on how this should be done varied: many felt it should be done to maximise equality of access to AI (with profits used to fund AI tools and resources for schools that cannot do this themselves), while others felt profits should be shared equally. Few participants thought profits should correspond to each school’s level of data sharing and AI use, and participants were especially positive about profits being used to level the playing field for schools. 

While participants did want schools to profit from AI use, some felt this could happen through profits being used by local authorities or regional bodies to improve education in the area, or by DfE to improve the education sector at a national level, rather than being distributed to individual schools. Most were comfortable with profits being shared between schools and DfE; however, the general assumption was that pupils would benefit most directly if profits were distributed to individual schools. 

Participants accepted that tech companies would profit in some way from the use of pupil work and data, but the consensus was that they should not be the sole beneficiaries. Parents of children with SEND were particularly negative about AI tool development becoming a money-making exercise. Understanding of how exactly tech companies could profit was limited, with most assuming that they would make money by selling pupils’ data to third parties. There was a lack of awareness of other ways in which they might benefit from this data use such as by developing other AI tools for commercial use. On prompting, this form of benefit was generally seen as acceptable if used to develop educational tools for use outside the education sector, but unacceptable if used to develop tools for other purposes. This possibility was seen as misusing data for something other than its intended use, reflecting existing discomfort and concerns about data being sold by tech companies without participants’ knowledge or agreement. 

7. Reflections and implications for future research 

7.1 Methodological reflections 

Due to time pressures, the in-person fieldwork was carried out as a single six-hour session per location. Sitting still and processing information for this length of time can be challenging for adults’ attention spans and energy, but it was particularly difficult for pupils. We knew we would need to share large volumes of information, and aimed to make the sessions as engaging as possible by:  

Using different types of stimulus (including animations, videos from experts, worksheets, hands-on demonstrations of AI tools);  

Providing written summaries of all videos; and  

Including activities that would require participants to stand up and move around (including voting exercises). 

However, in the end, we had to adapt our approach in several ways to counteract participant fatigue: 

In the first workshop, we asked participants to compare three different future scenarios, with detailed information about the different use cases of AI in education, the types of data and work that would be used to optimise it, and the conditions in place to regulate its use. This activity took place towards the end of the workshop, and participants found it very challenging to compare such abstract, yet detailed, scenarios. In subsequent workshops, we focussed instead on asking participants to describe the future they would like to see, rather than testing potential scenarios first. 

We gave pupils additional break time after lunch. By this point they had understood the basic principles of machine-learning and this meant they were more refreshed for the final activity where we discussed conditions for use. 

Some lessons for future engagement workshops: 

Including more interactive tools can help to bring concepts to life and keep participants engaged. Participants who had not previously used LLM tools benefited from being able to see how they work in practice. For future engagements, it may be worth thinking carefully about how devices and applications can be used in sessions. 

There are some practical implications for running joint sessions for parent and pupil groups, as they have different needs. We adapted discussion guides for parents and pupils and, as much as possible, made all stimulus suitable for the youngest sample members. However, it may be worth considering splitting groups, so their agendas are decoupled from one another, allowing more flexibility and further adaptation to suit participants’ age.  

Shorter sessions over several weeks, as well as a mix of in-person and online fieldwork, may be more suitable for complex topics such as this. Online participants, who had a week between workshops, returned to the second session refreshed. In addition, many had used the interim to think about or discuss what they had learnt with friends or family, which meant they brought more nuanced perceptions and opinions to the final session. 

7.2 Areas for future research 

The research showed that awareness, understanding, and opinions of AI are all still evolving. As the technology becomes more established, the public will be further exposed to its applications and form opinions based on those experiences. However, we also know how important the commentary and opinion of others - both expert and lay person - are in shaping views and impacting trust. For parents in particular, other parents are powerful influencers, so it will be important to continue engaging with this audience to understand how they feel about the use of AI in education. 

There are also a number of specific questions surfaced by the research, which we feel warrant further exploration: 

The relationship between private interest and public good: How comfortable are parents and pupils with private companies profiting, and how can those companies be held to account and incentivised to put the public good first?  

Oversight and coordination of data sharing: To what extent is there support for the central management and facilitation of data access across government and with researchers and private companies? Would parents and pupils be comfortable with an “EDR UK” organisation, similar to HDR UK, ADR UK, or SDR UK? 

Equal access and opting out: What happens if you want to opt out? And how can we ensure nobody is left behind? 

8. Appendix 

8.1 Demographic sample breakdown 

 
Location: Bristol (36); Birmingham (36); Newcastle (36)
Location type: City/Urban (48); Suburban/Small town/Large village (32); Rural (26); Unknown (2)
Gender: Male (43); Female (65)
Age: 18 and under (36); 19-24 (1); 25-39 (22); 40-59 (47); 60+ (2)
Ethnicity: White (79); Black, Black British, Caribbean or African (16); Asian or Asian British (10); Mixed or Multiple ethnic groups (2); Other (1)
Feeling about technological developments and uses of AI (parents only): Excited (36); Sceptical/Worried (36)
Total: 108

8.2 Expert video breakdown 

 
Tom Nixon, Head of Government Practice at Faculty: What is AI and why is it important?
Andreas Varotsis, Data Scientist at 10 Downing Street: What is machine learning?
Chris Goodall, Head of Digital Education at Bourne Educational Trust: Potential benefits of using AI for teachers and pupils
Scott Hayden, Head of Digital Learning at Basingstoke College of Technology: Potential benefits of using AI for teachers and pupils
Fay Skevington, Digital Strategy at the Department for Education: Potential risks of using AI around data protection, privacy, and IP
Baroness Barran, Parliamentary Under-Secretary of State at the Department for Education: The bigger picture: wider risks and benefits of AI use and how to manage them

