Data Scientist

Remote

Background:  

Did you know that the biggest contributor to the Australian economy is small-to-medium businesses? Australia’s 2 million SMEs employ over half the country’s workforce, account for a third of GDP, and are the lifeblood of communities around the country.

We all know an Aussie business owner. It could be your partner, your mother, your brother, or the person you buy your morning coffee from. People start a business to be their own boss, pursue their passion and showcase their skills. But the reality is very different: long days, sleepless nights, non-existent weekends and endless financial admin.

For every sale, there is an invoice to raise, a staff member to pay and an expense to reconcile. In fact, small businesses spend over 6 hours a week on financial admin. That equates to a staggering 42 days of lost productivity every year.

About Thriday:

Thriday is a revolutionary new product that combines everything a business needs into one seamless, automated solution. We have created the fastest way for small businesses to manage their banking, accounting and tax obligations.

Thriday acts like a CFO in your pocket, taking care of everything a business needs. Using the data and insights from a Thriday business transaction account and debit card, Thriday offers intelligent value-adds like expense tracking, invoicing, tax forecasting, payroll and more. These tools help win back time for busy business owners and, more importantly, allow them to plan for the future with real confidence.

These valuable features are made possible by our blank-sheet technical stack, which leverages AI and ML fed by a real-time stream of data from the core business transaction account. No legacy systems, no antiquated technology: just a fresh new approach.

The Role:

We are looking for a Data Scientist to join our data science team. You will have a passion for discovering the value hidden in large data sets and for working with the product team to improve model outcomes. This is a perfect opportunity for the successful candidate to become part of an innovative and energetic team.

To succeed in this role, you must be confident working with big data: understanding it, applying AI techniques to extract the hidden value within large datasets, sourcing data from multiple sources through APIs, integrating that data, and predicting future trends by correlating data from multiple sources.

Key Responsibilities include:

  • Working with large transactional datasets and categorising transactions using AI-based models
  • Analysing large amounts of information to discover trends and patterns
  • Processing, cleansing, and verifying the integrity of data used for analysis
  • Undertaking pre-processing of structured and unstructured data
  • Extending the company’s data with third-party sources of information when needed
  • Using analytical findings to select the key features for building analytic systems and machine learning models (feature engineering)
  • Building self-learning predictive models with advanced machine learning algorithms
  • Combining models through ensemble modelling
  • Presenting information using data visualisation techniques
  • Keeping up to date with the latest technology trends
  • Performing ad hoc analysis and presenting results clearly
  • Sourcing data from multiple providers via APIs
  • Developing ETL jobs (in Azure Data Factory or similar technologies)
  • Creating automated systems and continuously tracking their performance.

Experience required:

  • Experienced at using data-science languages (Python is a must; R, etc.) to manipulate data, correlate key features, build machine learning models and draw insights from large data sets
  • Strong command of advanced statistical techniques and concepts (clustering, classification using algorithms such as decision forests and neural networks (CNNs and RNNs), statistical tests, etc.)
  • Experienced in scheduling AI models to run in batch mode
  • Confident with the Azure cloud environment and tools such as Data Factory, Databricks and Azure Cognitive Analytics
  • Confident working with REST APIs
  • Confident working with Bitbucket, GitHub, GitLab or similar tools
  • Advanced SQL knowledge, including query authoring and experience working with a variety of relational databases
  • Strong problem-solving skills
  • 3+ years of experience in a Data Science role.

Our values:

Driven by purpose  

Helping businesses succeed is why we show up every day. We invest time into understanding what matters most in accomplishing our purpose, and use data to make sound decisions and determine the best course of action.  

Committed to action

By setting clear goals we act with focus, determination, resilience and ownership to deliver our commitments. We keep moving forward in pursuit of the best outcomes and willingly take on worthwhile challenges.

Embrace ingenuity  

We have the desire and autonomy to seek new ideas, try new things and take calculated risks that contribute to our customers’ success. We have ambition in our hearts and always consider the possibilities, because what is good today won't be good enough for tomorrow.

Continuous growth  

Our enthusiasm to constantly learn, build knowledge and improve what we do means we can achieve anything. We grow together by tracking our performance, sharing constructive feedback, collectively adopting a growth mindset and seeing our mistakes as opportunities to learn and grow. If we thrive, so will our customers.

In trust we thrive  

We empower each other and our partners to perform with confidence and creativity, believing that those closest to the decision will make the best decisions. We are always honest and transparent, expect the same in return, and genuinely recognise the efforts and achievements of others. Trust bonds us together.

Time to get on our radar 📡
