Job description | Analytics Engineer
Contract type: Permanent
Location: London
Salary: c.£65,000 per annum, plus a Civil Service pension employer contribution of 28%. Higher salary ranges may be available for exceptional candidates.
Hours: Flexible working and part-time hours will be considered. The NAO offers hybrid working based on a minimum of two days a week in the office.
Closing date for applications is 23:59 on Sunday 23 February 2025.
Nationality Requirement:
• UK Nationals
• Nationals of Commonwealth countries who have the right to work in the UK
• Nationals from the EU, EEA or Switzerland with (or eligible for) status under the European Union Settlement Scheme (EUSS)
Please note, we are not able to sponsor work visas or accept temporary visas as we are looking to hire on a permanent basis. Please contact the HR Service desk (hrservicedesk@nao.org.uk) should you have any questions on your nationality eligibility.
About the National Audit Office
The National Audit Office (NAO) is the UK’s main public sector audit body. Independent of government, we have responsibility for auditing the accounts of various public sector bodies, examining the propriety of government spending, assessing risks to financial control and accountability, and reviewing the economy, efficiency and effectiveness of programmes, projects, and activities. We report directly to Parliament through the Committee of Public Accounts of the House of Commons, which uses our reports as the basis of its own investigations. We employ approximately 1,000 people, most of whom are qualified accountants, trainees, or technicians. The organisation comprises two service lines, financial audit and value for money (VFM) audit, supported by a strong core of highly talented corporate teams.
The NAO welcomes applications from everyone. We value diversity in all its forms and the difference it makes to our organisation. By removing barriers and creating an inclusive culture, we enable all our people to develop and maximise their full potential. As members of the Business Disability Forum and the Disability Confident Scheme, we guarantee to interview all disabled applicants who meet the minimum criteria.
The NAO supports flexible working and is happy to discuss this with you at application stage.
Context and main purpose of the job:
Introduction:
The analytics engineer is a newly created role within the NAO’s Digital Services (DS) function, with responsibility for supporting the development and continual improvement of the NAO’s data and technology services. You will support the adoption of emerging technologies to automate or accelerate relevant NAO processes and to derive deeper insights from corporate and client data.
In this capacity, you will transform organisational data into structured formats suitable for analysis and decision-making. You will develop and test data models, explore local data sources, and construct pipelines from corporate repositories to data science and machine learning models. Acting as a domain-specific collaborator to data engineers, you will facilitate the conversion of data into actionable intelligence, contributing to the NAO’s commitment to data-driven excellence.
In this role, you will:
Collaborate with subject matter experts and data users to design optimised data structures and models for analysis.
Support data quality improvement and develop standards for data transformation.
Create, maintain, and document data processes to ensure transparency and usability.
Refine requirements based on user feedback and organizational changes.
Provide ongoing support, training, and issue resolution for data users.
Ensure data documentation meets established standards.
This role reports to the XXXX.
This role requires regular attendance at the NAO’s office in Victoria, London, or at the office in Newcastle.
Responsibilities of the role:
As an analytics engineer, you are responsible for ensuring data is clean, structured, and ready for analysis. You will create data models, automate data processes, and collaborate with stakeholders to support business decisions. Your work makes it easier for us to derive insights from data.
In this role, your responsibilities will include:
Design, build, and maintain data pipelines: Develop and manage robust data pipelines that ensure efficient and reliable data flow from various sources to data storage and processing systems. This includes automating data collection, cleaning, and integration processes to support analytics and reporting needs.
Develop and optimise ETL processes: Create and enhance ETL (Extract, Transform, Load) processes to extract data from multiple sources, transform it into a usable format, and load it into data storage systems. Ensure these processes are efficient and scalable, and that they maintain high data quality and integrity.
Collaborate with data scientists and analysts: Work closely with data scientists, analysts, and other stakeholders to understand their data requirements and provide the necessary infrastructure and tools. Facilitate seamless data access and support the development of data models and analytical solutions.
Implement and manage data warehousing solutions: Design, implement, and maintain data warehousing solutions that support scalable and efficient data storage and retrieval. Ensure the data warehouse architecture meets the needs of the organisation and supports advanced analytics and reporting.
Ensure data governance and security: Implement best practices for data governance, including data privacy, security, and compliance with relevant regulations. Establish and enforce policies and procedures to protect sensitive data and ensure its ethical use.
Optimise database performance: Manage and optimise relational and non-relational databases to ensure efficient data storage, retrieval, and performance. Perform regular database tuning, indexing, and query optimisation to maintain high performance and reliability.
Create and maintain documentation: Develop comprehensive documentation for data pipelines, ETL processes, and data architecture. Ensure that documentation is up-to-date, clear, and accessible to relevant stakeholders, facilitating knowledge sharing and continuity.
Monitor and troubleshoot data systems: Proactively monitor data systems for issues, perform root cause analysis, and implement solutions to ensure system reliability. Establish monitoring and alerting mechanisms to detect and address data quality and performance problems promptly.
Support data-driven decision-making: Provide the data infrastructure and tools that enable stakeholders to access and analyse data effectively. Ensure that data is accurate, timely, and accessible, empowering the organisation to make informed decisions based on reliable insights.
Stay updated with industry trends: Continuously learn and apply new technologies, tools, and best practices in data engineering and analytics. Stay informed about industry trends and advancements to improve processes, systems, and overall data strategy.
Develop and maintain data documentation: Create and update comprehensive documentation for all data-related processes, including data sources, transformations, and storage. Ensure that documentation is clear, detailed, and accessible to all relevant stakeholders to support transparency and knowledge sharing.
Key skills / competencies required
Each skill listed includes the corresponding skill level (Awareness, Working, Practitioner or Expert):
Communicating between the technical and non-technical: You can listen to the needs of the technical and business stakeholders, and interpret them. You effectively manage stakeholder expectations, using active and reactive communication. You can support or host difficult discussions within the team, or with diverse senior stakeholders. (Skill level: Practitioner)
Data Analysis and Synthesis: You can undertake data profiling and source system analysis. You can present clear insights to colleagues to support the end use of the data. (Skill level: Working)
Data Innovation: You can understand the impact on the organisation of emerging trends in data tools, analysis techniques and data usage. (Skill level: Working)
Data Modelling, Cleansing and Enrichment: You can build and review complex data models, ensuring adherence to standards. You can use data integration tools and languages to integrate and store data, advising teams on best practice. You can ensure data for analysis meets data quality standards and is interoperable with other data sets, enabling reuse. You can work with other data professionals to improve modelling and integration patterns and standards. (Skill level: Practitioner)
Metadata Management: You can design an appropriate metadata repository and suggest changes to improve current metadata repositories. You can understand a range of tools for storing and working with metadata, advising others about metadata management. (Skill level: Practitioner)
Problem Management: You can initiate and monitor actions to investigate patterns and trends and to resolve problems, effectively consulting specialists as required. You can determine the appropriate resolution and assist with its implementation, as well as determining preventative measures. (Skill level: Working)
Programming and Build (data engineering): You can use agreed standards and tools to design, code, test, correct and document moderate-to-complex programs and scripts from agreed specifications and subsequent iterations. You can collaborate with others to review specifications where appropriate. (Skill level: Practitioner)
Testing: You can review requirements and specifications, and define test conditions. You can identify issues and risks associated with work, analysing and reporting test activities and results. (Skill level: Working)
Turning business problems into design: You can design data architectures that deal with problems spanning different business areas. You can identify links between problems to devise common solutions. You can work across multiple subject areas, producing appropriate patterns. (Skill level: Practitioner)
Experience
Strong proficiency in data analysis and statistical methods: Demonstrated experience in using tools like SQL, Python, R, and data visualisation software (e.g., Tableau, Power BI) to analyse and interpret complex datasets. Capable of applying statistical techniques to derive actionable insights and support data-driven decision-making.
Experience in data engineering and ETL processes: Proven ability to design, build, and maintain data pipelines, ensuring data quality and integrity. Familiarity with data warehousing solutions and cloud platforms (e.g., AWS, Google Cloud, Azure) is essential. Skilled in extracting, transforming, and loading data from various sources to create reliable and scalable data systems.
Proficiency in database management and optimization: Experience in managing and optimizing relational and non-relational databases, ensuring efficient data storage, retrieval, and performance tuning. Knowledgeable in database design principles and best practices to support robust data architectures.
Strong problem-solving and communication skills: Ability to translate business requirements into technical solutions, effectively communicate insights to stakeholders, and collaborate with cross-functional teams to drive data-driven decision-making. Adept at identifying and resolving data-related issues and presenting findings in a clear and concise manner.