Company

Singtel

Job Type

Full Time

Salary

SGD 6,000 per month

Location

Singapore

Company Description

Singtel is one of the largest listed Singapore companies on the Singapore Exchange by market capitalisation. The Group has a vast network of offices throughout Asia Pacific, Europe and the USA, and employs more than 23,000 staff worldwide.

Job Description

    - Evaluate and renew implemented big data architecture solutions to ensure their relevance and effectiveness in supporting business needs and growth.
    - Design, develop and maintain data pipelines, with a focus on writing scalable, clean and fault-tolerant code that handles disparate data sources and processes large volumes of structured and unstructured data.
    - Understand business requirements and solution designs to develop and implement solutions that adhere to big data architectural guidelines and address business requirements.
    - Support and maintain previously implemented big data projects, and provide guidance and consultation on other projects in active development as needed.
    - Drive optimisation, testing and tooling to improve data quality.
    - Document and communicate technical complexities completely and clearly to team members and other key stakeholders.
    - Develop architecture solutions for varied latency needs such as batch, real-time, near-real-time and on-demand APIs.
    - Work closely with the data science team to gather data requirements that support modelling.
    - Review and approve high-level and detailed designs to ensure that the solution delivers on the business needs while aligning with the data and analytics architecture principles and roadmap.
    - Help establish and maintain data governance processes and mechanisms for the data lake and EDW.
    - Understand various data security standards and use secure data governance tools to apply and adhere to the required controls on a per-dataset basis for user access.
    - Maintain and optimise the performance of our data analytics environment.

Requirements

    - Degree in business management, IT, computer systems, software or computer engineering, or an equivalent field.
    - Minimum 6 years of experience in data warehousing / big data environments.
    - Experience in relational and dimensional data modelling and in performance tuning of enterprise warehouses / big data environments.
    - Experience with big data processing (Spark experience preferred).
    - Experience designing and developing data models, integrating data from multiple sources, building ETL pipelines, and using other data wrangling tools in big data environments.
    - Understanding of structured and unstructured data design and modelling.
    - Experience applying software engineering best practices in programming, testing, version control, agile development, etc.
