This role is responsible for cleaning raw data that contains human, machine, or instrument errors and for developing dataset processes for data production, mining, and modeling. Jobholders will leverage large volumes of structured and unstructured data from internal and external sources to meet the business needs of the bank and its key stakeholders. In addition, they will build analytics programs, machine learning models, and statistical methods for predictive and prescriptive modeling, and develop visualizations and dashboards to communicate results.
Key duties and responsibilities
Data cleansing – identify, assess, and resolve data quality issues, and develop and execute data cleanup measures.
Evaluate large datasets for quality and accuracy, and work with ICT staff to correct data quality errors.
Determine the root cause of data quality errors and recommend long-term solutions.
Extract data from a variety of relational databases, and manipulate and explore data using quantitative, statistical, and visualization tools.
Inform the selection of appropriate modeling techniques to ensure that predictive models are developed using rigorous statistical processes.
Establish and maintain effective processes for validating and updating predictive models.
Analyze, model, and forecast economic patterns and trends, and create the capability to model what-if scenario outcomes for novel economic patterns.
Help manage the innovation cycle: conducting analyses, generating insights, advocating for the integration of new concepts into existing tools, and helping translate ad hoc analyses into scalable solutions.
Drive complex analytics projects that require collaboration across multiple teams, departments, or stakeholders from beginning to end.
Build algorithms, predictive models, and dashboards that are deployed in the form of products, policy, or scientific research.
Qualifications, Experience and Skills
Education and Experience Requirements
Master’s degree in Information Technology, Computer Science, Data Science, Mathematics, Physics, Statistics, Electronics Engineering, or another quantitative field, with 2 years of experience.
Proficiency in reviewing, analyzing, and managing large volumes of data using programming languages and tools such as Python, R, MATLAB, and Tableau.
Understanding of big data tools, methodologies, and techniques.
Familiarity with analytical techniques in text mining, sentiment analysis, machine learning, deep learning, predictive analysis, event detection, and modeling of complex systems.
Programming and database skills in any of the following languages/technologies are a plus: JavaScript, C/C++/C#, SQL, .NET, Python (NLTK), TensorFlow, etc.
Method of Application
Submit your CV and application on the company website. Closing Date: 4th November, 2020.