GET INVOLVED

Projects

OpenSI projects

OpenSI is still in the early stages of establishment; however, several initiatives and emerging projects are currently being considered. Each project is described under one of our research themes, and a Technology Readiness Level (TRL) is applied to indicate its current state and the intended state on completion.

Ongoing

Quantum Machine Learning for Image Classification

THEME

Artificial intelligence and machine learning

TIMEFRAME

3.5–4 years

PROJECT TYPE

PhD in Information Technology

TRL SCORE

TRL3 to TRL5

DESCRIPTION

This project explores quantum machine learning algorithms for image classification and investigates embedding techniques. It seeks to leverage the power of quantum computing to enhance traditional kernel methods used in image classification. As part of the collaborative effort between academia and industry, this initiative aligns with the goals of the OpenSI Institute, fostering innovation and knowledge exchange. Objectives include investigating quantum kernels, developing quantum machine learning models, and integrating them into existing image classification frameworks. Key components involve quantum computing platforms, Python, quantum libraries such as PennyLane or TensorFlow Quantum, classical machine learning frameworks, and large-scale image datasets. Anticipated outcomes include improved classification accuracy, scalability of quantum algorithms, and insights into the potential of quantum machine learning for image analysis tasks. This initiative highlights the transformative impact of quantum computing on traditional machine learning paradigms, paving the way for advanced image classification techniques.
This project addresses a pressing need in the field of image classification, where traditional methods may struggle with complex datasets. By exploring quantum kernels, we aim to unlock new avenues for more accurate and efficient classification algorithms.
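To make the quantum-kernel idea concrete: a feature map embeds each data point into a quantum state, and the kernel value is the squared overlap between two embedded states, which a classical kernel method (such as an SVM) can then consume. The sketch below simulates a simple angle embedding classically with NumPy; it is an illustrative toy, not the project's method, and libraries such as PennyLane provide equivalent building blocks for real quantum devices.

```python
import numpy as np

def feature_map(x):
    """Angle embedding: each feature x_i rotates one qubit by RY(x_i).

    The embedded state is the tensor product of single-qubit states
    [cos(x_i/2), sin(x_i/2)], simulated classically here.
    """
    state = np.array([1.0])
    for xi in x:
        qubit = np.array([np.cos(xi / 2), np.sin(xi / 2)])
        state = np.kron(state, qubit)
    return state

def quantum_kernel(x1, x2):
    """Kernel value = squared overlap |<phi(x1)|phi(x2)>|^2."""
    return np.abs(feature_map(x1) @ feature_map(x2)) ** 2

# Two toy 2-feature samples; a real pipeline would use image embeddings.
x_a = np.array([0.3, 1.2])
x_b = np.array([0.7, 0.1])
K = np.array([[quantum_kernel(a, b) for b in (x_a, x_b)] for a in (x_a, x_b)])
# K is symmetric with ones on the diagonal and could be passed to a
# classical kernel method, e.g. sklearn.svm.SVC(kernel="precomputed").
```

The interest in quantum kernels comes from feature maps that are hard to evaluate classically; this particular product-state embedding is classically simulable and serves only to show the kernel-matrix construction.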

SKILLS/EXPERTISE

The ideal PhD candidate for this project should have a solid foundation in computer science, mathematics, or related fields. They should possess expertise in machine learning, particularly in image classification, and have a strong understanding of quantum computing concepts. Proficiency in programming languages such as Python is essential, along with familiarity with quantum libraries and frameworks. Additionally, the candidate should demonstrate critical thinking skills and a passion for exploring cutting-edge technologies.

Ongoing

Leveraging Open-Source Technology for AI Automation with Apache Cassandra

THEME

Artificial intelligence and machine learning

TIMEFRAME

3.5 years

PROJECT TYPE

PhD in Information Technology

TRL SCORE

TRL3 to TRL5

DESCRIPTION

The project aims to develop an AI-driven tool leveraging the open-source Cadence workflow engine to automate application development. This initiative, part of the OpenSI Institute formed by NetApp Australia and the University of Canberra, targets university-industry collaboration using open-source technologies. Objectives include platform development, enterprise service discovery for automation, migration of existing codebases to Cadence, and open-sourcing the tool. Key components involve Cadence, Python, TensorFlow, Apache Cassandra/PostgreSQL vector database, Docker, Kubernetes, and RESTful APIs. Anticipated outcomes encompass enhanced efficiency through plain language-based code generation, maximised value of existing codebases via auto-discovery and migration, scalability, reliability, and fault tolerance through Cadence, and cost-effectiveness through open-source technologies. This initiative underscores the transformative potential of AI and open-source collaboration in driving innovation, efficiency, and sustainability in application development.

This is a valuable problem to solve, as the “durable function” style of workflow execution that Cadence provides has proven to be a superior approach (in terms of maintainability, scalability, and reliability) for many stateful applications. However, porting existing applications to this approach is very time-consuming.
Embedded within the PhD in Information Technology program at the University of Canberra, this opportunity is uniquely tailored for working professionals. The program’s flexibility accommodates career commitments while offering strong industry connections that provide valuable networking opportunities and insights into real-world applications.
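The "durable function" model can be illustrated in a few lines: workflow code is deterministic, side effects live in activities, and after a crash the engine replays the workflow against a history of recorded activity results instead of re-executing completed work. Cadence's official client libraries are Go and Java; the Python sketch below is a language-agnostic toy of the replay idea only, and every name in it is hypothetical rather than the Cadence API.

```python
class DurableContext:
    """Toy replay engine: recorded activity results are replayed,
    and only unseen steps actually execute (illustrative, not Cadence)."""

    def __init__(self, history):
        self.history = history   # recorded activity results, in order
        self.position = 0

    def run_activity(self, fn, *args):
        if self.position < len(self.history):
            result = self.history[self.position]  # replay: skip re-execution
        else:
            result = fn(*args)                    # first run: execute and record
            self.history.append(result)
        self.position += 1
        return result

def provision_workflow(ctx, order_id):
    # Deterministic workflow code; all side effects go through activities.
    db = ctx.run_activity(lambda: f"db-{order_id}")
    dns = ctx.run_activity(lambda: f"dns-{order_id}")
    return {"db": db, "dns": dns}

history = []
first = provision_workflow(DurableContext(history), 42)
# After a simulated crash, replaying from history reproduces the same
# state without re-running the already-completed activities.
replayed = provision_workflow(DurableContext(history), 42)
```

This replay constraint is exactly why porting existing codebases is slow: side-effecting code must first be factored out into activities, which is the migration step the project aims to automate.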

SKILLS/EXPERTISE

The ideal PhD candidate for this project would possess a strong background in computer science, data engineering, or a related field. They should have experience and expertise in open-source technologies, particularly in AI, machine learning, and data management. The candidate should demonstrate proficiency in programming languages such as Python and familiarity with frameworks like TensorFlow.

Completed

Fast Fourier Transform (FFT) based compression

THEME

Data solutions at scale

TIMEFRAME

Less than 12 months

PROJECT TYPE

Embedded project

TRL SCORE

TRL5 to TRL7

DESCRIPTION

This project aims to leverage signal processing to compress time series data by encoding data as an audio wave and compressing it using lossless audio codecs.

An initial proof of concept has demonstrated a 17- to 50-fold reduction in raw data size for time series-based data.

Successful applicant(s) will work embedded within the Instaclustr R&D team to help advance this initial proof of concept to at least the stage of a technical demonstration.
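The encoding step described above can be sketched in a few lines: quantise a float time series onto the 16-bit PCM range and wrap it in a WAV container, after which a lossless audio codec such as FLAC (via an external encoder, not shown) would compress the stream. This is a hedged illustration of the general approach, not the Instaclustr proof of concept; function names are illustrative.

```python
import io
import wave
import numpy as np

def encode_as_pcm(series):
    """Scale a float time series onto the full signed 16-bit range.

    The round-trip error is bounded by the quantisation step,
    (hi - lo) / 65535, so the encoding is near-lossless for 16 bits.
    """
    lo, hi = series.min(), series.max()
    scaled = (series - lo) / (hi - lo)                 # -> [0, 1]
    pcm = np.round(scaled * 65535 - 32768).astype(np.int16)
    return pcm, lo, hi

def decode_pcm(pcm, lo, hi):
    return (pcm.astype(np.float64) + 32768) / 65535 * (hi - lo) + lo

t = np.linspace(0, 1, 8000)
series = np.sin(2 * np.pi * 5 * t) * 10 + 3           # synthetic sensor signal
pcm, lo, hi = encode_as_pcm(series)

# Wrap the samples in a WAV container; a lossless audio codec would
# then compress this byte stream far below its raw size.
buf = io.BytesIO()
with wave.open(buf, "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)      # 16-bit samples
    wav.setframerate(8000)
    wav.writeframes(pcm.tobytes())

roundtrip = decode_pcm(pcm, lo, hi)
max_err = np.max(np.abs(roundtrip - series))          # bounded by quantisation
```

Smooth sensor-style signals resemble audio waveforms, which is why audio codecs built for exactly that structure can achieve the large compression ratios reported above.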

SKILLS/EXPERTISE

Experience in signal processing and data compression.

Closed

Bloom filter encrypted indexes

THEME

Cyber security and privacy

PROJECT TYPE

Embedded project

TRL SCORE

TRL6 to TRL8

DESCRIPTION

This project aims to use an existing body of knowledge to further develop a Multidimensional Bloom Filter index for Apache Cassandra® that allows users to search sets without revealing the data they are searching for.

Some initial work on this project has commenced, and results can be found here.
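The primitive underneath such an index is the Bloom filter itself: membership queries touch only hashed bit positions, never the stored values, so the index can answer "might this set contain X?" without revealing X's plaintext to the index structure. Below is a deliberately simplified single-filter sketch of that primitive (the project's multidimensional variant for Apache Cassandra® is considerably more involved); all names are illustrative.

```python
import hashlib

class BloomFilter:
    """Basic Bloom filter: k hash positions per item, no stored plaintext.

    False positives are possible; false negatives are not. A simplified
    illustration of the primitive behind Bloom-filter encrypted indexes.
    """

    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = bytearray(m)

    def _positions(self, item):
        # Derive k independent positions by salting the hash input.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def might_contain(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("alice@example.com")
# The filter stores only 4 set bits, not the email address itself.
```

Because only hash-derived bit positions are stored, an adversary holding the filter learns which bits are set but cannot directly read back the indexed values, which is the property the encrypted-index work builds on.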

SKILLS/EXPERTISE

Experience in encryption and set theory.

Closed

Fast horizontal scaling of distributed systems

THEME

Data solutions at scale

TIMEFRAME

6-12 months

PROJECT TYPE

Project

TRL SCORE

TRL3 to TRL5

DESCRIPTION

Horizontal scaling (adding or removing nodes) of distributed systems is typically a time-consuming and computationally expensive operation, requiring significant volumes of data to be copied across the network.

Initial research indicates that, for many distributed data systems, the time and compute requirements of horizontal scaling can be traded off against other attributes of the system, and that this trade-off would be beneficial in many use cases.

This project would investigate potential trade-offs and implementation approaches to at least proof of concept stage with a distributed data system such as Apache Cassandra® or Apache Kafka®.

SKILLS/EXPERTISE

Distributed data systems. Apache Cassandra® or Apache Kafka® knowledge particularly beneficial.

Our research themes

Data solutions at scale

Projects that aim to advance technical capability in effectively scaling large datasets and/or data infrastructure systems and software.

Cyber security and privacy

Projects that aim to increase cyber and/or privacy resilience or protect against various cyber threats.

Artificial intelligence and machine learning

Projects that aim to further advance capability in AI or ML.

Open source business models

Research into successful models and approaches to creating, licensing, managing and collaborating on open source projects and capabilities.
