Recent Submissions

  • Weighted Graph Compression using Genetic Algorithms

    Rutkowski, Emilia; Department of Computer Science
    Networks are a great way to present information. It is easy to see how different objects interact with one another, and the nature of their interaction. However, the technological era has brought a massive surge in data, and it is now common for networks/graphs to be very large. When graphs get too large, processing them becomes computationally expensive and slow. This is common in areas such as bioinformatics, epidemic contact tracing, and social networks, among many others. Graph compression is the process of merging highly connected nodes into one super-node, thus shrinking the graph. The goal of graph compression is to merge nodes while mitigating the amount of information lost in the process. Unweighted graphs are widely studied in this area; in this thesis, we extend these approaches to compress weighted graphs via genetic algorithms and analyse the compression from an epidemic point of view. Our results show that edge weights provide vital information for graph compression, and that meaningful edge weights matter, since different weightings lead to different compressions. Moreover, both the original and adjusted edge weights produce results that differ from those of a widely used community detection algorithm, the Louvain algorithm; these differing results may nevertheless be helpful to public health officials. Lastly, the NSGA-II algorithm was implemented. NSGA-II proved more suitable as a pre-processing tool: it finds a target compression that introduces an acceptable level of distortion, after which the single-objective genetic algorithm is used to improve the solution for that target.
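
    As a rough illustration of the merging step described above, the sketch below collapses two nodes of a weighted graph into a single super-node, summing the weights of any parallel edges the merge creates. This is a minimal sketch of the general idea, not the thesis's genetic algorithm or its distortion measure.

    ```python
    def merge_nodes(adj, u, v):
        """Merge nodes u and v of a weighted graph into one super-node.

        adj: dict mapping node -> dict of neighbour -> edge weight.
        """
        super_node = (u, v)
        merged = {}
        for node in (u, v):
            for nbr, w in adj[node].items():
                if nbr in (u, v):
                    continue  # drop the internal u-v edge
                merged[nbr] = merged.get(nbr, 0) + w  # sum parallel edges
        for nbr, w in merged.items():
            adj[nbr].pop(u, None)
            adj[nbr].pop(v, None)
            adj[nbr][super_node] = w  # rewire neighbours to the super-node
        del adj[u], adj[v]
        adj[super_node] = merged
        return super_node

    adj = {"a": {"b": 2.0, "c": 1.0}, "b": {"a": 2.0, "c": 3.0}, "c": {"a": 1.0, "b": 3.0}}
    merge_nodes(adj, "a", "b")  # adj is now {("a", "b"): {"c": 4.0}, "c": {("a", "b"): 4.0}}
    ```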
  • Multi-guide Particle Swarm Optimisation for Dynamic Multi-objective Optimisation Problems

    Jocko, Pawel; Department of Computer Science
    This study investigates the suitability of, and adapts, the multi-guide particle swarm optimisation (MGPSO) algorithm for dynamic multi-objective optimisation problems (DMOPs). The MGPSO is a multi-swarm approach, originally developed for static multi-objective optimisation problems (SMOPs), in which each subswarm optimises one of the objectives. It uses a bounded archive based on a crowding-distance archive implementation. Compared to static optimisation problems, DMOPs pose a challenge for meta-heuristics because there is more than one objective to optimise, and the location of the Pareto-optimal set (POS) and the Pareto-optimal front (POF) can change over time. To efficiently track the changing POF in DMOPs using MGPSO, six archive management update approaches, eight archive balance coefficient initialisation strategies, and six quantum particle swarm optimisation (QPSO) variants are proposed. To evaluate the adapted MGPSO for DMOPs, a total of twenty-nine well-known benchmark functions and six performance measures were implemented. Three experiments were run against five different environment types with varying temporal and spatial severities. The best strategies from each experiment were then compared with other dynamic multi-objective optimisation algorithms (DMOAs). An extensive empirical analysis shows that the adapted MGPSO achieves very competitive, and often better, performance compared to existing DMOAs.
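
    For orientation, the defining feature of MGPSO is a third, archive-guided term in the velocity update, balanced against the social term by a per-particle coefficient lambda. The sketch below follows the commonly published formulation; parameter values are illustrative, and archive-guide selection is omitted:

    ```python
    import random

    def mgpso_velocity(v, x, pbest, gbest, archive_guide, lam,
                       w=0.729, c1=1.49, c2=1.49, c3=1.49):
        """One MGPSO velocity update for a particle in a subswarm."""
        new_v = []
        for j in range(len(x)):
            r1, r2, r3 = random.random(), random.random(), random.random()
            cognitive = c1 * r1 * (pbest[j] - x[j])
            social = lam * c2 * r2 * (gbest[j] - x[j])                 # own objective
            archive = (1 - lam) * c3 * r3 * (archive_guide[j] - x[j])  # trade-offs
            new_v.append(w * v[j] + cognitive + social + archive)
        return new_v
    ```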
  • Mixed Media in Evolutionary Art

    Maslen, Jordan; Department of Computer Science
    This thesis focuses on creating evolutionary art with genetic programming. The main goal of the system is to produce novel stylized images using mixed media. Mixed media on a canvas is the use of multiple artistic effects to produce interesting and new images. This approach uses genetic programming (GP), in which each individual in the population represents its own unique solution. The fitness of each individual is determined by direct colour matching between the GP canvas and a target image. A secondary goal was to see how well different computer graphics techniques work together; in particular, bitmaps have not been studied much in evolutionary art. Results show a variety of unique solutions with the application of mixed media.
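
    The fitness evaluation described above can be pictured as a summed per-pixel colour distance, where lower is better. This is a hedged sketch assuming same-sized images stored as nested lists of RGB tuples, not the system's exact error measure:

    ```python
    def colour_match_fitness(canvas, target):
        """Sum of per-pixel RGB distances between canvas and target."""
        error = 0.0
        for row_c, row_t in zip(canvas, target):
            for (r1, g1, b1), (r2, g2, b2) in zip(row_c, row_t):
                error += abs(r1 - r2) + abs(g1 - g2) + abs(b1 - b2)
        return error
    ```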
  • Landscape Aware Algorithm Configuration

    Dennis, Cody; Department of Computer Science
    The issue of parameter selection cannot be ignored if optimal performance is to be obtained from an algorithm on a specific problem, or if a collection of algorithms is to be compared in a fair manner. Unfortunately, adequately addressing the issue of parameter selection is time consuming and computationally expensive. Searching for appropriate control parameters generally requires much more time than actually solving the problem at hand, due to the need to perform many complete runs of the target algorithm. The number of runs required to obtain thorough and equal coverage of the parameter space grows exponentially with the number of parameters. As a result, the costs associated with parameter selection become a limiting factor in the scale of problems that can be investigated. The primary goal of this work is to reduce the costs of parameter selection. In pursuit of this goal, this thesis examines the use of neural networks to intelligently select appropriate control parameter values based on the characteristics of the problem at hand. Two general-purpose approaches are evaluated: one that predicts a single set of control parameters to use throughout a run of the target algorithm, and another that dynamically adjusts algorithm control parameters at run time. These approaches are examined in detail using the Particle Swarm Optimization algorithm. A comparison with state-of-the-art automated tools for control parameter selection indicates that the cost of parameter selection can be significantly reduced.
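
    The core idea, mapping measurable problem characteristics to control parameter values, can be pictured as a learned regression. The toy below uses a single linear layer with invented weights purely for illustration; the thesis's networks, features, and parameter sets are its own:

    ```python
    def predict_parameters(features, weights, bias):
        """One linear layer: parameters = W @ features + b."""
        return [sum(w * f for w, f in zip(row, features)) + b
                for row, b in zip(weights, bias)]

    # Hypothetical: 3 landscape features mapped to PSO's (w, c1, c2).
    W = [[0.1, -0.2, 0.05], [0.3, 0.1, 0.0], [0.0, 0.2, 0.1]]
    b = [0.7, 1.4, 1.4]
    w, c1, c2 = predict_parameters([0.8, 0.3, 0.5], W, b)
    ```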
  • Strategies for Evolving Diverse and Effective Behaviours in Pursuit Domains

    Cowan, Tyler James; Department of Computer Science
    Evolutionary algorithms have a tendency to overuse and exploit particular behaviours in their search for optimality, even across separate runs. The resulting set of monotonous solutions caused by this tendency is a problem in many applications. This research explores different strategies designed to encourage an interesting set of diverse behaviours while still maintaining an appreciable level of efficacy. Embodied agents are situated within an open plane and play against each other in various pursuit game scenarios. The pursuit games consist of a single predator agent and twenty prey agents, with the goal always requiring the predator to catch as many prey as possible before the time limit is reached. The predator's controller is evolved through genetic programming while the preys' controllers are hand-crafted. The fitness of a solution is first calculated in a traditional manner. Inspired by Lehman and Stanley's novelty search strategy, the fitness is then combined with the diversity of the solution to produce the final fitness score. The original fitness score is determined by the number of captured prey, and the diversity score is determined through the combination of four behaviour measurements. Among many promising results, a particular diversity-based evaluation strategy and weighting combination was found to provide solutions that exhibit an excellent balance between diversity and efficacy. The results were analyzed quantitatively and qualitatively, showing the emergence of diverse and effective behaviours.
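
    The final scoring described above can be sketched as a weighted blend of raw fitness and a novelty term, with novelty measured as the mean distance to the k nearest behaviours seen so far, in the spirit of Lehman and Stanley. The weighting, distance, and k below are assumptions, not the thesis's exact strategy:

    ```python
    def combined_fitness(fitness, behaviour, seen_behaviours, k=5, alpha=0.5):
        """Blend raw fitness (prey captured) with behavioural novelty."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        nearest = sorted(dist(behaviour, other) for other in seen_behaviours)[:k]
        novelty = sum(nearest) / len(nearest) if nearest else 0.0
        return alpha * fitness + (1 - alpha) * novelty
    ```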
  • An Implementation of Separation Logic in Coq

    Wang, Yi; Department of Computer Science
    For certain applications, the correctness of the software involved is crucial, particularly if human life is in danger. To achieve correctness, common practice is to gather evidence of program correctness by testing the system. Even though testing may find certain errors in the code, it cannot guarantee that the program is error-free. Formal verification is the process of proving or disproving the correctness of a system with respect to a formal specification. A well-known logic for program verification is Hoare Logic. Hoare Logic can deal with programs that do not utilize pointers, i.e., it allows reasoning about programs that do not use shared mutable data structures. Separation Logic extends Hoare Logic to allow reasoning about pointers, including pointer arithmetic, in the programming language. It has four pointer-manipulating commands, which perform the heap operations lookup, allocation, deallocation, and mutation. We introduce an implementation of Separation Logic in the interactive proof system Coq. Besides verifying that Separation Logic is correct, we provide several examples of programs and their correctness proofs.
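
    For orientation, the rule that distinguishes Separation Logic from plain Hoare Logic is the frame rule, which licenses local reasoning about disjoint parts of the heap (standard formulation, independent of this thesis's particular Coq encoding):

    ```latex
    % Frame rule: a proof about the heap cells C touches extends to any
    % disjoint heap described by R.
    \[
      \frac{\{P\}\; C\; \{Q\}}
           {\{P * R\}\; C\; \{Q * R\}}
      \qquad \text{(no variable modified by } C \text{ occurs free in } R\text{)}
    \]
    ```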
  • Adaptive Q-learning-supported Resource Allocation Model in Vehicular Fogs

    Hossain, Md Tahmid; Department of Computer Science
    Urban computing has become a significant driver in supporting the delivery and sharing of services, and a strong ally to intelligent transportation. Smart vehicles present computing and communication capabilities that allow them to enable many autonomous vehicular safety and infotainment applications. Vehicular Cloud Computing (VCC) has already proven to be a paradigm-shifting technology, harnessing the computation resources of vehicles' on-board units to form clustered computing units that solve real-world computing problems. However, with the rise in vehicular application use and intermittent network conditions, VCC exhibits many drawbacks. Vehicular Fog computing appears as a new paradigm enabling and facilitating efficient service and resource sharing in urban environments. Several vehicular resource management works have attempted to deal with the highly dynamic vehicular environment following diverse approaches, e.g., MDP, SMDP, and policy-based greedy techniques. However, high vehicular mobility causes several challenges compromising consistency, efficiency, and quality of service. RL-enabled adaptive vehicular Fogs can deal with this mobility by properly distributing load and resources over Fogs. Thus, we propose a mobility-based cloudlet dwell time (CDT) estimation method for accurately estimating vehicular resources in a Fog. Leveraging the CDT estimation model, we devise an adaptive and highly dynamic resource allocation model that uses a mathematical formulation for Fog selection and reinforcement learning as an iterative review and feedback mechanism to generate an optimal resource allocation policy.
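
    The reinforcement learning component can be pictured as a standard Q-learning update over Fog-selection decisions. The state/action framing below is an illustrative placeholder, not the thesis's model:

    ```python
    def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
        """Tabular Q-learning step: Q[s][a] += alpha * (r + gamma * max Q[s'] - Q[s][a])."""
        best_next = max(Q[next_state].values()) if Q[next_state] else 0.0
        Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

    # Hypothetical usage: states encode Fog load, actions pick a Fog.
    Q = {s: {a: 0.0 for a in ("fog1", "fog2")} for s in ("low_load", "high_load")}
    q_update(Q, "low_load", "fog1", reward=1.0, next_state="high_load")
    ```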
  • Optimal Quaternary Hermitian Linear Complementary Dual Codes for Entanglement-Assisted Quantum Error Correction

    Al Jumaily, Maysara; Department of Computer Science
    The objective of this thesis is to find suboptimal and optimal parameters from classical codes and import them into entanglement-assisted quantum codes. The thesis begins by introducing classical error correction, followed by a detailed introduction to quantum computing. Topics discussed in the introduction include qubits, quantum phenomena such as superposition and entanglement, and quantum gates/circuits. The thesis then reviews the basics of quantum error correction and presents Shor's code to reinforce the reader's understanding. Subsequently, the formalism of stabilizer codes is thoroughly examined. We then explain entanglement-assisted quantum codes, a generalization of stabilizer codes whose generators need not satisfy the commutativity property; instead, ebits are used to resolve the anti-commutativity constraint. Next, the thesis explains the quaternary field and the Java program implemented to find the optimal parameters. Lastly, the thesis concludes by presenting the parameters of the new codes obtained throughout the research. We have found the suboptimal largest distance for quaternary Hermitian linear complementary dual codes that can be imported as entanglement-assisted quantum error-correcting codes for parameters [22, 9, 9 or 10]₄, [22, 12, 7 or 8]₄, [23, 8, 11 or 12]₄, [23, 10, 9 or 10]₄, [23, 13, 7 or 8]₄, [24, 10, 10 or 11]₄, [24, 11, 9 or 10]₄, [24, 14, 7 or 8]₄, [25, 12, 9 or 10]₄, [25, 13, 8 or 9]₄, as well as the optimal largest distance for [17, 11, 5]₄ and [17, 13, 3]₄.
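
    For reference, the standard definitions behind "Hermitian linear complementary dual" (not specific to this thesis): conjugation on GF(4) is the Frobenius map x ↦ x², giving the Hermitian inner product and the LCD condition

    ```latex
    \[
      \langle u, v \rangle_H \;=\; \sum_{i=1}^{n} u_i\, v_i^{2},
      \qquad u, v \in \mathrm{GF}(4)^n,
    \]
    \[
      C \text{ is Hermitian LCD} \iff C \cap C^{\perp_H} = \{0\}.
    \]
    ```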
  • Managing Diversity and Many Objectives in Evolutionary Design

    Basher, Sheikh Faishal; Department of Computer Science
    This thesis proposes a new approach to evolving a diversity of high-quality solutions for problems having many objectives. Mouret and Clune's MAP-Elites algorithm has been proposed as a way to evolve an assortment of diverse solutions to a problem. We extend MAP-Elites in a number of ways. Firstly, we introduce a many-objective strategy called sum-of-ranks, which enables problems with many objectives (four or more) to be considered in the MAP. Secondly, we enhance MAP-Elites by extending it with multiple solutions per "grid" cell (the original MAP-Elites saves only a single solution per cell). A few different ways of selecting cell members for reproduction are also considered. We test the new MAP-Elites strategies on the evolutionary art application of image generation. Using procedural textures, genetic programming is used with upwards of 15 lightweight image features to guide fitness. The goal is to evolve images that share image features with a given target image. Our experiments show that the new MAP-Elites algorithms produce a large number of diverse solutions of varying quality. The extended MAP-Elites algorithm is also statistically competitive compared to vanilla GP in this application domain.
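
    The sum-of-ranks strategy mentioned above can be sketched directly: rank the population separately on each objective, then sum each individual's ranks into one comparable score. The sketch assumes minimization and breaks ties arbitrarily:

    ```python
    def sum_of_ranks(objectives):
        """objectives[i][j]: value of objective j for individual i (lower = better)."""
        n, m = len(objectives), len(objectives[0])
        scores = [0] * n
        for j in range(m):
            order = sorted(range(n), key=lambda i: objectives[i][j])
            for rank, i in enumerate(order):
                scores[i] += rank
        return scores  # lower total rank = better overall
    ```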
  • Evolving Passive Solar Buildings Using Multi-Behavioural Diversity Search Strategies

    Salma, Umme; Department of Computer Science
    Energy-efficient building design plays a major role in building a green environment and planning a sustainable urban area. Energy-efficiency measures for building design account for heating, cooling, and ventilation, as well as construction material costs. In passive solar building design, sunlight exposure is used to heat the building in winter and reject heat in summer, keeping the building cool. The goals of passive solar building design are to minimize the energy cost and the devices used for heating or cooling. The major goal of this research is to increase the diversity of solutions evolved with an evolutionary system for green building design. An existing fitness-based genetic programming system for building design is enhanced with a search paradigm called novelty search, which uses measured aspects of designs in an attempt to promote more diverse or novel solutions. Instead of optimizing an objective, novelty search measures behaviours to obtain diverse solutions. We combine novelty search and fitness scores using a many-objective strategy called sum of ranks. The simulation software EnergyPlus is used to evaluate the building designs and energy costs. We compare vanilla genetic programming solutions with our novelty-driven solutions. Experimental results show that genetic programming solutions are more fit, but novelty strategies create more diverse solutions; for example, novelty search solutions use a much more diverse selection of building materials.
  • Game Theory-based Allocation Management in VCC Networks

    Tejani, Binal; Department of Computer Science
    Vehicular Ad-hoc Networks (VANETs) have contributed significantly towards improving road traffic management and safety. VANETs, integrated with Vehicular Clouds, enable underutilized vehicular resources to be used for efficient resource management and the fulfillment of service requests. However, due to the frequently changing network topology of vehicular cloud networks, vehicles often move out of the coverage area of roadside units (RSUs), disconnecting from the RSUs and interrupting the fulfillment of ongoing service requests. In addition, working with heterogeneous vehicles makes it difficult to match service requests with the varying resources of individual vehicles. To address these challenges, this work introduces the concept of clustering resources from nearby vehicles to form Combined Resource Units (CRUs). These units help maximize the rate of fulfillment of service requests. CRU composition is especially helpful given the heterogeneity of vehicles, since it clusters the varying resources of vehicles into a single unit. Vehicle resources are clustered into CRUs based on three different-sized pools, making the service matching process more time-efficient. Previous works have adopted stochastic models for resource clustering configurations; this work instead adopts distinct search algorithms for CRU composition, which are computationally less complex. Results showed that lightweight search algorithms, such as the selective search algorithm (SSA), achieved close to 80% resource availability without over-assembling CRUs in higher-density scenarios. Following CRU composition, a game-theoretic approach is adopted for allocating CRUs to service requests. Under this approach, the CRUs play a non-cooperative game to maximize their utility, contributing to fairness, efficiency, improved system performance, and reduced system overhead. The utility value takes into account the received signal strength (RSS) of each CRU and the resources required to fulfill a request. Results of the game model showed that the proposed approach to CRU composition obtained a 90% success rate in matching and fulfilling service requests.
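
    The non-cooperative game's utility can be pictured as a weighted blend of signal quality and resource fit. The form and weights below are invented for illustration; the thesis's utility function is its own:

    ```python
    def cru_utility(rss, rss_max, resources, requested, w_signal=0.5):
        """Utility of a CRU bidding on a request: signal strength plus resource match."""
        match = min(resources / requested, 1.0)  # over-provisioning earns no bonus
        signal = rss / rss_max                   # normalized received signal strength
        return w_signal * signal + (1 - w_signal) * match
    ```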
  • Multi-Guide Particle Swarm Optimization for Large-Scale Multi-Objective Optimization Problems

    Madani, Amirali; Department of Computer Science
    Multi-guide particle swarm optimization (MGPSO) is a novel metaheuristic for multi-objective optimization based on particle swarm optimization (PSO). MGPSO has been shown to be competitive when compared with other state-of-the-art multi-objective optimization algorithms for low-dimensional problems. However, to the best of the author’s knowledge, the suitability of MGPSO for high-dimensional multi-objective optimization problems has not been studied. One goal of this thesis is to provide a scalability study of MGPSO in order to evaluate its efficacy for high-dimensional multi-objective optimization problems. It is observed that while MGPSO has comparable performance to state-of-the-art multi-objective optimization algorithms, it experiences a performance drop as the problem dimensionality increases. Therefore, a main contribution of this work is a new scalable MGPSO-based algorithm, termed cooperative co-evolutionary multi-guide particle swarm optimization (CCMGPSO), that incorporates ideas from cooperative PSOs. A detailed empirical study is performed on well-known benchmark problems, comparing the proposed approach with various state-of-the-art multi-objective optimization algorithms. Results show that the proposed CCMGPSO is highly competitive for high-dimensional problems.
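
    The cooperative co-evolutionary idea can be sketched in two pieces: split the decision variables among subpopulations, and evaluate a subcomponent by splicing it into a shared context vector. The grouping and context handling below are simplified assumptions:

    ```python
    def decompose(n, k):
        """Round-robin split of n dimensions into k subcomponents."""
        dims = list(range(n))
        return [dims[i::k] for i in range(k)]

    def evaluate_subcomponent(f, context, dims, candidate):
        """Evaluate candidate values for `dims` inside the cooperative context."""
        x = list(context)
        for d, value in zip(dims, candidate):
            x[d] = value
        return f(x)
    ```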
  • MDP-based Vehicular Network Connectivity Model for VCC Management

    Saad, Abubakar; Department of Computer Science
    Vehicular Cloud computing is a new paradigm in which vehicles collaboratively exchange data and resources to support services and problem-solving in urban environments. Characteristically, such Clouds undergo severely challenging conditions due to the high mobility of vehicles, and in essence they are rather dynamic and complex. Many works have explored the assembly and management of Vehicular Clouds with designs that focus heavily on mobility. However, a mobility-based strategy relies on vehicles' geographical position, and its feasibility has been questioned in some recent works. Therefore, we present a more relaxed Vehicular Cloud management scheme that relies on connectivity. This work models uncertainty and considers every possible chance a vehicle may be available through accessible communication means, such as vehicle-to-everything (V2X) communications and the vehicle being in range of road-side units (RSUs) for data transmissions. We propose a Markov decision process (MDP) model to track vehicles' connection status and estimate their reliability for data transmissions. From our analyses, we also observed that higher vehicle connectivity presents a trace of repeated connection patterns. We reinforce the connectivity status by validating it through an availability model to distinguish the vehicles that support high availability regardless of their positioning. The availability model thus determines the suitability of the MDP model in a given environment.
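
    The connectivity MDP can be pictured with a toy two-state model ("connected"/"disconnected") solved by value iteration. All probabilities and rewards below are invented for illustration:

    ```python
    states = ("connected", "disconnected")
    actions = ("transmit", "wait")
    P = {  # P[s][a] = [(next_state, probability), ...]
        "connected":    {"transmit": [("connected", 0.8), ("disconnected", 0.2)],
                         "wait":     [("connected", 0.9), ("disconnected", 0.1)]},
        "disconnected": {"transmit": [("connected", 0.3), ("disconnected", 0.7)],
                         "wait":     [("connected", 0.4), ("disconnected", 0.6)]},
    }
    R = {"connected":    {"transmit": 1.0,  "wait": 0.0},
         "disconnected": {"transmit": -1.0, "wait": 0.0}}

    V = {s: 0.0 for s in states}
    for _ in range(100):  # value iteration with discount 0.9
        V = {s: max(R[s][a] + 0.9 * sum(p * V[s2] for s2, p in P[s][a])
                    for a in actions)
             for s in states}
    ```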
  • Analysis of the Niching Particle Swarm Optimization Algorithm

    Crane, Tyler; Department of Computer Science
    Multimodal optimization (MMO) techniques have been researched and developed over the years to track multiple global optima concurrently. MMO algorithms extend traditional unimodal optimization algorithms by using search strategies built around forming niches for multiple possible solutions. NichePSO was one of the first approaches to utilize particle swarm optimization (PSO) for MMO problems, using several small subswarms of agents working concurrently to form niches within the search space. Despite its promising performance, NichePSO suffers from some problems, and very little research has been done to study and improve upon the algorithm over the years. A main goal of this thesis is to analyze the NichePSO algorithm, gaining insight into its strengths and weaknesses. Empirical analyses were performed to study NichePSO's ability to maintain niches within complex problem domains, as well as methods for improving the overall performance and effectiveness of the algorithm. Two variants of the NichePSO algorithm are proposed, and experimental results show that both significantly improve the performance of NichePSO across several benchmark functions.
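
    For orientation, NichePSO-style niching hinges on tests like the subswarm merge check: two subswarms merge when their best positions fall within each other's niche radii. The sketch assumes that common formulation; exact criteria vary across NichePSO variants:

    ```python
    def should_merge(swarm_a, swarm_b):
        """Merge test: subswarms whose niches overlap are combined."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        return dist(swarm_a["best"], swarm_b["best"]) < swarm_a["radius"] + swarm_b["radius"]
    ```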
  • Comparative Study On Cooperative Particle Swarm Optimization Decomposition Methods for Large-scale Optimization

    Clark, Mitchell; Department of Computer Science
    The vast majority of real-world optimization problems fall into the class of large-scale optimization problems (LSOPs). Over the past few years, an abundance of cooperative coevolutionary (CC) algorithms has been proposed to combat the challenges of LSOPs. When CC algorithms attempt to address large-scale problems, the effects of interconnected variables, known as variable dependencies, cause extreme performance degradation. The literature has extensively reviewed approaches to decomposing problems with variable dependencies during optimization, often with a wide range of base optimizers. In this thesis, we use the cooperative particle swarm optimization (CPSO) algorithm as the base optimizer and perform an extensive scalability study with a range of decomposition methods to determine ideal divide-and-conquer approaches when using a CPSO. Experimental results demonstrate that dynamic regrouping of variables, as seen in the merging CPSO (MCPSO) and decomposition CPSO (DCPSO), as well as varying the total fitness evaluations per dimension, resulted in high-quality solutions when compared to six state-of-the-art decomposition approaches.
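
    The dynamic regrouping that distinguishes variants such as MCPSO and DCPSO can be sketched as periodically reshuffling variables into new subcomponents, so that interdependent variables eventually land in the same group. The schedule and grouping below are assumptions:

    ```python
    import random

    def regroup(n, k):
        """Randomly repartition n decision variables into k subcomponents."""
        dims = list(range(n))
        random.shuffle(dims)
        size = -(-n // k)  # ceiling division
        return [dims[i:i + size] for i in range(0, n, size)]
    ```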
  • Improving BWA-MEM with GPU Parallel Computing

    Li, Connor; Department of Computer Science
    Due to the many advances made in designing algorithms, especially those used in bioinformatics, it is becoming harder and harder to improve their efficiency. Therefore, hardware acceleration using general-purpose computing on graphics processing units (GPGPU) has become a popular choice. BWA-MEM is an important part of the BWA software package for sequence mapping. Because of its high speed and accuracy, we chose to parallelize this popular short-DNA-sequence mapper. BWA has been a prevalent single-node tool in genome alignment, and it has been widely studied for acceleration ever since the first version of the BWA package came out. This thesis presents the Big Data GPGPU distributed BWA-MEM, a tool that combines GPGPU acceleration and distributed computing. The four hardware parallelization techniques used are CPU multi-threading, GPU parallelization, CPU distribution, and GPU distribution. The GPGPU distributed software typically outperforms the other parallelized versions. The alignment is performed on a distributed network, and each node in the network executes a separate GPGPU-parallelized version of the software. We parallelize the chain2aln function at three levels. In Level 1, the function ksw_extend2, an algorithm based on Smith-Waterman, is parallelized to handle extension on one side of the seed. In Level 2, the function chain2aln is parallelized to handle chain extension, where all seeds within the same chain are extended. In Level 3, part of the function mem_align1_core is parallelized to extend multiple chains. Due to the program's complexity, the parallelization work was limited to the GPU version of ksw_extend2 at parallelization Level 3. However, we successfully combined Spark with BWA-MEM and ksw_extend2 at parallelization Level 1, which shows that the proposed framework is feasible. The parallelized Level 3 GPU version of ksw_extend2 demonstrated noticeable speed improvement on the test data set.
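
    For orientation, ksw_extend2 builds on Smith-Waterman-style local alignment scoring. The minimal CPU-only sketch below uses linear gap costs, whereas the real kernel performs banded, one-sided seed extension with affine gaps:

    ```python
    def smith_waterman(a, b, match=2, mismatch=-2, gap=-3):
        """Best local alignment score between sequences a and b."""
        H = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        best = 0
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                s = match if a[i - 1] == b[j - 1] else mismatch
                H[i][j] = max(0, H[i - 1][j - 1] + s,
                              H[i - 1][j] + gap, H[i][j - 1] + gap)
                best = max(best, H[i][j])
        return best

    print(smith_waterman("ACACACTA", "AGCACACA"))
    ```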
  • Neural Network Guided Evolution of L-system Plants

    Chen, Xuhao (Eric); Department of Computer Science
    A Lindenmayer system (L-system) is a parallel rewriting system that generates graphic shapes using several rules. Genetic programming (GP) is an evolutionary algorithm that evolves expressions. A convolutional neural network (CNN) is a type of neural network useful for image recognition and classification. The goal of this thesis is to generate different styles of L-system-based 2D images of trees from scratch using genetic programming. The system uses a convolutional neural network to evaluate the trees and produce a fitness value for genetic programming. Different CNN architectures are explored. We analyze the performance of the system and show the capabilities of combining CNN and GP. We show that a variety of interesting tree images can be automatically evolved. We also found that the success of the system depends highly on CNN training, as well as on the form of the GP's L-system language representation.
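
    The parallel rewriting at the heart of an L-system is compact enough to show in full: every symbol is rewritten simultaneously each generation. The rules below are a well-known plant-like example, not the evolved ones from this thesis ([ and ] push and pop the turtle state when the string is drawn):

    ```python
    def lsystem(axiom, rules, generations):
        """Apply parallel rewriting rules to the axiom for some generations."""
        s = axiom
        for _ in range(generations):
            s = "".join(rules.get(ch, ch) for ch in s)
        return s

    rules = {"X": "F[+X]F[-X]+X", "F": "FF"}
    print(lsystem("X", rules, 3))
    ```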
  • Epidemic Simulation and Mitigation via Evolutionary Computation

    Dubé, Michael; Department of Computer Science
    A global pandemic remains a public health event that presents a unique and unpredictable challenge for those making health-related decisions and for the populations who experience the virus. However, a pandemic also provides the opportunity for researchers and health administrations around the world to mobilize in the fields of epidemiology, computer science, and mathematics to generate epidemic models, vaccines, and vaccination strategies to mitigate unfavourable outcomes. To this end, the Local THADS-N generative representation, which creates personal contact networks representing the social connections within a population, is introduced and expanded upon. This representation uses an evolutionary algorithm and is modified to include new local edge operations, improving the performance of the system across several test problems. These problems include an epidemic's duration, its spread through a population, and its closeness to past epidemic behaviour. The system is further developed to represent sub-communities known as districts, better articulating epidemics spreading within and between neighbourhoods. In addition, the representation is used to simulate four competing vaccination strategies in preparation for iterative vaccine deployment amongst a population, an inevitability considering the lag inherent in developing vaccines. Finally, the Susceptible-Infected-Removed (SIR) model of infection used by the system is expanded in preparation for adding an asymptomatic state of infection, as seen in the COVID-19 pandemic.
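
    The SIR dynamics on a personal contact network can be sketched as a per-step simulation: each infected node infects susceptible neighbours with probability beta, then recovers with probability gamma. The parameters and the synchronous update are illustrative simplifications:

    ```python
    import random

    def sir_step(adj, state, beta=0.1, gamma=0.05):
        """One SIR step on a contact network; state maps node -> 'S'/'I'/'R'."""
        new_state = dict(state)
        for node, s in state.items():
            if s == "I":
                for nbr in adj[node]:
                    if state[nbr] == "S" and random.random() < beta:
                        new_state[nbr] = "I"
                if random.random() < gamma:
                    new_state[node] = "R"
        return new_state
    ```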
  • Data Mining using L-Fuzzy Concept Analysis

    Saha, Sajal; Department of Computer Science
    Association rules in data mining are implications between attributes of objects that hold in all instances of the given data. These rules are very useful for determining properties of the data, such as the essential features of products that determine the purchase decisions of customers. Normally the data is given as binary (or crisp) tables relating objects to their attributes by yes-no entries. We propose a relational theory for generating attribute implications from many-valued contexts, i.e., contexts where the relationship between objects and attributes is given by a range of degrees from no to yes. This degree is usually taken from a suitable lattice where the smallest element corresponds to the classical no and the greatest element corresponds to the classical yes. Previous related work handled many-valued contexts by transforming them into crisp (yes-no) contexts, either by scaling or by choosing a minimal degree of membership; the standard methods of formal concept analysis were then applied to the crisp context. In our proposal, we handle a many-valued context as is, i.e., without transforming it into a crisp one. The advantage of this approach is that we work with the original data, without a transformation step that modifies the data in advance.
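
    A toy version of evaluating an attribute implication directly on many-valued data: using the unit interval as the lattice and the Goedel residuum as the implication, the validity degree of "a implies b" is the minimum over all objects. Both simplifications are assumptions; the thesis works over arbitrary suitable lattices:

    ```python
    def godel_implication(a, b):
        """Goedel residuum on [0, 1]."""
        return 1.0 if a <= b else b

    def implication_degree(table, attr_a, attr_b):
        """Degree to which attr_a -> attr_b holds across all objects."""
        return min(godel_implication(row[attr_a], row[attr_b]) for row in table)

    data = [{"cheap": 0.9, "popular": 1.0}, {"cheap": 0.4, "popular": 0.6}]
    print(implication_degree(data, "cheap", "popular"))  # 1.0
    ```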
  • Object Classification using L-Fuzzy Concept Analysis

    Addison, George Tsekpetse; Department of Computer Science
    Object classification and processing have become an integral part of modern industrial manufacturing systems, generally utilized in manual or computerized inspection processes. Vagueness is a common issue in object classification and analysis, arising from ambiguity in the input data, overlapping boundaries among classes or regions, and indefiniteness in defining or extracting features and the relations among them. The main purpose of this thesis is to construct, define, and implement an abstract algebraic framework of L-fuzzy relations to represent the uncertainties involved at every stage of object classification. The framework handles this vagueness by retaining as much information as possible from the original data for decision-making at the highest level, so that the ultimate output of the associated system carries the least uncertainty.
