Unit Award Scheme

116061 COMPUTATIONAL THINKING: UNDERSTANDING NUMBER BASES

In successfully completing this unit, the Learner will have (the evidence needed for each outcome is shown in square brackets)

acquired an understanding of

1  different number bases, ie binary, denary (decimal) and hexadecimal, and why they are used in computers  [Student completed work]

demonstrated the ability to

2  convert between binary and denary, binary and hexadecimal, and denary and hexadecimal  [Student completed work]
3  add together three binary numbers up to a maximum of 8 bits (see the sketch after this list)  [Student completed work]
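
For illustration only (not part of the unit specification or its required evidence), the Python sketch below shows one way of exercising outcomes 2 and 3: converting between binary, denary and hexadecimal using standard built-ins, then adding three binary numbers of up to 8 bits each. The function names and example values are assumptions chosen for clarity.

    # Illustrative sketch: base conversion and binary addition in Python.

    def to_denary(bits: str) -> int:
        """Interpret a binary string such as '1001101' as a denary integer."""
        return int(bits, 2)

    def to_binary(value: int, width: int = 8) -> str:
        """Write a denary integer as a fixed-width binary string."""
        return format(value, f"0{width}b")

    def to_hex(value: int) -> str:
        """Write a denary integer as hexadecimal digits (upper case, no 0x prefix)."""
        return format(value, "X")

    # Outcome 2: convert between binary, denary and hexadecimal.
    print(to_denary("1001101"))   # 77
    print(to_binary(77))          # 01001101
    print(to_hex(77))             # 4D
    print(int("4D", 16))          # 77

    # Outcome 3: add together three binary numbers, each up to 8 bits.
    a, b, c = "00011011", "01100101", "00000111"
    total = to_denary(a) + to_denary(b) + to_denary(c)
    print(to_binary(total, 10), total)   # 0010000111 135 (the sum of three 8-bit values can need up to 10 bits)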

shown knowledge of

4  how the binary, denary (decimal) and hexadecimal number systems can be used to represent whole numbers  [Summary sheet]
5  the principles of binary addition (a worked example follows this list)  [Student completed work]
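
A brief worked example of the kind outcomes 4 and 5 describe, written in LaTeX notation; the value 77 is an arbitrary choice. The same whole number in the three bases, built from its place values:

    77_{10} = 64 + 8 + 4 + 1 = 1001101_2 = \mathrm{4D}_{16}

and the single-column rules that drive binary addition (write the right-hand digit, carry the left):

    0 + 0 = 0 \qquad 0 + 1 = 1 \qquad 1 + 1 = 10_2 \ (\text{write } 0,\ \text{carry } 1) \qquad 1 + 1 + 1 = 11_2 \ (\text{write } 1,\ \text{carry } 1)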

acquired an understanding of

6  the purpose of data and instructions in computer systems being represented in binary form, and why hexadecimal is often used in computer science (see the sketch after this list)  [Summary sheet]
7  how a computer recognises data as a series of 1s and 0s  [Summary sheet]
8  what a bit is when working with binary in a computational thinking context  [Summary sheet]
9  how the binary, denary (decimal) and hexadecimal number systems work.  [Summary sheet]
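
To illustrate outcomes 6 and 9 (again purely as an aid, not required evidence): every hexadecimal digit corresponds to exactly one group of four bits (a nibble), which is why an 8-bit byte is always two hex digits and why hexadecimal is used as a compact, human-readable shorthand for binary. The Python sketch below makes that pairing explicit; the names and sample byte are assumptions.

    # Illustrative sketch: each hexadecimal digit stands for one 4-bit group.
    HEX_DIGITS = "0123456789ABCDEF"

    def byte_to_hex(bits: str) -> str:
        """Convert an 8-bit binary string into its two hexadecimal digits."""
        assert len(bits) == 8 and set(bits) <= {"0", "1"}
        high, low = bits[:4], bits[4:]   # split the byte into two nibbles
        return HEX_DIGITS[int(high, 2)] + HEX_DIGITS[int(low, 2)]

    print(byte_to_hex("01001101"))   # 4D  (the bit pattern 01001101 is 77 in denary)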

All outcomes recorded on an AQA Summary Sheet

Approved 9 August 2021
Level: Level Two