• Managing Diversity and Many Objectives in Evolutionary Design

      Basher, Sheikh Faishal; Department of Computer Science
      This thesis proposes a new approach to evolving a diversity of high-quality solutions for problems having many objectives. Mouret and Clune's MAP-Elites algorithm was proposed as a way to evolve an assortment of diverse solutions to a problem. We extend MAP-Elites in a number of ways. Firstly, we introduce a many-objective strategy called sum-of-ranks, which enables problems with many objectives (four or more) to be considered in the MAP. Secondly, we extend MAP-Elites to store multiple solutions per "grid" cell (the original MAP-Elites saves only a single solution per cell). A few different ways of selecting cell members for reproduction are also considered. We test the new MAP-Elites strategies on the evolutionary art application of image generation. Using procedural textures, genetic programming is used with upwards of 15 lightweight image features to guide fitness. The goal is to evolve images that share image features with a given target image. Our experiments show that the new MAP-Elites algorithms produce a large number of diverse solutions of varying quality. The extended MAP-Elites algorithm is also statistically competitive with vanilla GP in this application domain.
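      The two extensions above can be sketched briefly. The following is a minimal illustration, not the thesis's implementation: the function and parameter names (`sum_of_ranks`, `map_elites_step`, `cell_capacity`) are assumptions, and the toy descriptor and objective functions stand in for the image features used in the thesis.

```python
def sum_of_ranks(population, objectives):
    """Rank each individual on every objective (lower value = better rank),
    then sum its ranks; a smaller total is a better many-objective score."""
    totals = [0] * len(population)
    for obj in objectives:
        order = sorted(range(len(population)), key=lambda i: obj(population[i]))
        for rank, i in enumerate(order):
            totals[i] += rank
    return totals

def map_elites_step(archive, population, objectives, descriptor, cell_capacity=3):
    """Bin individuals into feature-space cells, keeping up to
    cell_capacity of the best-scoring solutions in each cell."""
    scores = sum_of_ranks(population, objectives)
    for ind, score in zip(population, scores):
        cell = archive.setdefault(descriptor(ind), [])
        cell.append((score, ind))
        cell.sort(key=lambda pair: pair[0])
        del cell[cell_capacity:]
    return archive
```

      For example, ten candidates falling into two feature cells would leave each cell holding only its three best-ranked members, preserving diversity across cells while keeping quality within each.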
    • MDP-based Vehicular Network Connectivity Model for VCC Management

      Saad, Abubakar; Department of Computer Science
      Vehicular Cloud computing is a new paradigm in which vehicles collaboratively exchange data and resources to support services and problem-solving in urban environments. Characteristically, such Clouds undergo severely challenging conditions owing to the high mobility of vehicles, and in essence they are rather dynamic and complex. Many works have explored the assembly and management of Vehicular Clouds with designs that focus heavily on mobility. However, a mobility-based strategy relies on vehicles' geographical position, and its feasibility has been questioned in some recent works. Therefore, we present a more relaxed Vehicular Cloud management scheme that relies on connectivity. This work models uncertainty and considers every possible chance a vehicle may be available through accessible communication means, such as vehicle-to-everything (V2X) communications and the vehicle being in range of road-side units (RSUs) for data transmissions. We propose a Markov decision process (MDP) model to track vehicles' connection status and estimate their reliability for data transmissions. Also, from our analyses, we observed that higher vehicle connectivity presents a trace of repeated connection patterns. We reinforce the connectivity status by validating it through an availability model to distinguish the vehicles that support high availability regardless of their positioning. The availability model thus determines the suitability of the MDP model in a given environment.
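      A simplified sketch of the connectivity-tracking idea, assuming a plain Markov chain over three illustrative states (`v2x`, `rsu`, `disconnected`) rather than the thesis's full MDP model; all names and the fallback-to-uniform choice are assumptions.

```python
from collections import defaultdict

STATES = ("v2x", "rsu", "disconnected")

def estimate_transitions(history):
    """Estimate a transition matrix from an observed sequence of connection
    states; rows with no observations fall back to a uniform distribution."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, curr in zip(history, history[1:]):
        counts[prev][curr] += 1
    matrix = {}
    for s in STATES:
        total = sum(counts[s].values())
        matrix[s] = {t: counts[s][t] / total if total else 1.0 / len(STATES)
                     for t in STATES}
    return matrix

def k_step_availability(matrix, start, k):
    """Probability the vehicle is reachable (via V2X or an RSU) after k steps."""
    dist = {s: 1.0 if s == start else 0.0 for s in STATES}
    for _ in range(k):
        dist = {t: sum(dist[s] * matrix[s][t] for s in STATES) for t in STATES}
    return dist["v2x"] + dist["rsu"]
```

      A vehicle whose history shows long runs of connected states yields a transition matrix that predicts high k-step availability, which is the sort of repeated connection pattern the abstract describes.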
    • Mixed Media in Evolutionary Art

      Maslen, Jordan; Department of Computer Science
      This thesis focuses on creating evolutionary art with genetic programming. The main goal of the system is to produce novel stylized images using mixed media. Mixed media on a canvas is the use of multiple artistic effects to produce interesting and new images. This approach uses genetic programming (GP), in which each individual in the population represents its own unique solution. The fitness of each individual is determined by direct colour matching of the GP canvas against the target image. The secondary goal was to see how well different computer graphic techniques work together. In particular, bitmaps have not been studied much in evolutionary art. Results show a variety of unique solutions with the application of mixed media.
    • Modal and Relevance Logics for Qualitative Spatial Reasoning

      Ghosh, Pranab Kumar; Department of Computer Science
      Qualitative Spatial Reasoning (QSR) is an alternative technique for representing spatial relations without using numbers. Regions and their relationships are used as qualitative terms. Most qualitative spatial reasoning has two aspects: (a) the first aspect is based on inclusion and focuses on the "part-of" relationship; this aspect is mathematically covered by mereology. (b) the second aspect focuses on topology, i.e., whether two regions are in "contact" without having a common part. Mereotopology is a mathematical theory that covers both aspects. The theoretical contribution of this thesis is to combine classical propositional logic with non-classical relevance logic to obtain a logic capable of reasoning about Boolean algebras, i.e., the mereological aspect of QSR. We then extend the logic further by adding modal operators in order to reason about topological contact, i.e., the topological aspect of QSR. We name this logic Modal Relevance Logic (MRL). We have provided a natural deduction system for this logic by defining inference rules for the operators and constants used in MRL, and have shown that our system is correct. Furthermore, we have used the functional programming language and interactive theorem prover Coq to implement the definitions and natural deduction rules in order to provide an interactive system for reasoning in the logic.
    • Modeling Metal Protein Complexes from Experimental Extended X-ray Absorption Fine Structure using Computational Intelligence

      Price, Collin; Department of Computer Science (Brock University, 2014-10-30)
      Experimental Extended X-ray Absorption Fine Structure (EXAFS) spectra carry information about the chemical structure of metal protein complexes. However, predicting the structure of such complexes from EXAFS spectra is not a simple task. Currently, methods such as Monte Carlo optimization or simulated annealing are used in structure refinement of EXAFS. These methods have proven somewhat successful in structure refinement but have not been successful in finding the global minimum. Multiple population-based algorithms, including a genetic algorithm, a restarting genetic algorithm, differential evolution, and particle swarm optimization, are studied for their effectiveness in structure refinement of EXAFS. The oxygen-evolving complex in S1 is used as a benchmark for comparing the algorithms. These algorithms were successful in finding new atomic structures that produced improved calculated EXAFS spectra over atomic structures previously found.
    • Modelling and Proving Cryptographic Protocols in the Spi Calculus using Coq

      Tonet, Adam; Department of Computer Science
      The spi calculus is a process algebra used to model cryptographic protocols. A process calculus is a means of modelling a system of concurrently interacting agents, and provides tools for the description of communications and synchronizations between those agents. The spi calculus is based on Robin Milner's pi calculus, which was itself based upon his Calculus of Communicating Systems (CCS). It was created by Martin Abadi and Andrew D. Gordon as an expansion of the pi calculus intended to focus on cryptographic protocols, and adds features such as the encryption and decryption of messages using keys. The Coq proof system is an interactive theorem prover that allows for the definition of types and functions, and provides means by which to prove properties about them. The spi calculus has been implemented in Coq and subsequently used to model and show an example proof of a property about a simple cryptographic protocol. This required the implementation of both the syntax and the semantics of the calculus, as well as the rules and axioms used to manipulate the syntax in proofs. We discuss the spi calculus in detail as defined by Abadi and Gordon, then the various challenges faced during the implementation of the calculus and our rationale for the decisions made in the process.
    • Multi-Guide Particle Swarm Optimization for Large-Scale Multi-Objective Optimization Problems

      Madani, Amirali; Department of Computer Science
      Multi-guide particle swarm optimization (MGPSO) is a novel metaheuristic for multi-objective optimization based on particle swarm optimization (PSO). MGPSO has been shown to be competitive when compared with other state-of-the-art multi-objective optimization algorithms for low-dimensional problems. However, to the best of the author's knowledge, the suitability of MGPSO for high-dimensional multi-objective optimization problems has not been studied. One goal of this thesis is to provide a scalability study of MGPSO in order to evaluate its efficacy for high-dimensional multi-objective optimization problems. It is observed that while MGPSO has comparable performance to state-of-the-art multi-objective optimization algorithms, its performance drops as the problem dimensionality increases. Therefore, the main contribution of this work is a new scalable MGPSO-based algorithm, termed cooperative co-evolutionary multi-guide particle swarm optimization (CCMGPSO), that incorporates ideas from cooperative PSOs. A detailed empirical study comparing the proposed approach with various state-of-the-art multi-objective optimization algorithms on well-known benchmark problems is performed. Results show that the proposed CCMGPSO is highly competitive for high-dimensional problems.
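      The cooperative co-evolutionary idea behind CCMGPSO, evaluating one subcomponent of the decision vector against a shared context vector, can be sketched roughly as follows. This is a simplification; the function name and argument layout are assumptions, not the thesis's code.

```python
def cc_evaluate(subsolution, context, group, objective):
    """Evaluate one subcomponent by splicing its values into a copy of the
    shared context vector at the dimensions owned by this group, then
    scoring the full candidate with the original objective function."""
    candidate = list(context)
    for idx, val in zip(group, subsolution):
        candidate[idx] = val
    return objective(candidate)
```

      Each cooperating swarm optimizes only its own group of dimensions, which is what lets the approach scale to high-dimensional problems: no single swarm ever searches the full decision space.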
    • A Multi-Objective Genetic Algorithm with Side Effect Machines for Motif Discovery

      Alizadeh Noori, Farhad; Department of Computer Science (Brock University, 2012-09-18)
      Understanding the machinery of gene regulation to control gene expression has been one of the main focuses of bioinformaticians for years. We use a multi-objective genetic algorithm to evolve a specialized version of side effect machines for degenerate motif discovery. We compare some suggested objectives for the motifs they find, test different multi-objective scoring schemes and probabilistic models for the background sequence models and report our results on a synthetic dataset and some biological benchmarking suites. We conclude with a comparison of our algorithm with some widely used motif discovery algorithms in the literature and suggest future directions for research in this area.
    • Multi-objective Genetic Algorithms for Multi-depot VRP with Time Windows

      Biswas, Sanjib; Department of Computer Science
      Efficient routing and scheduling has significant economic implications for many real-world situations arising in transportation logistics, scheduling, and distribution systems, among others. This work considers both the single-depot vehicle routing problem with time windows (VRPTW) and the multi-depot vehicle routing problem with time windows (MDVRPTW). An age-layered population structure genetic algorithm is proposed for both variants of the vehicle routing problem. To the best of the author's knowledge, this is the first work to provide a multi-objective genetic algorithm approach for the MDVRPTW using well-known benchmark data with up to 288 customers.
    • Multi-Objective Genetic Algorithms for the Single Allocation Hub Location Problem

      Asobiela, Stephen Yamzuuga; Department of Computer Science (Brock University, 2013-09-12)
      Hub Location Problems play vital economic roles in transportation and telecommunication networks where goods or people must be efficiently transferred from an origin to a destination point whilst direct origin-destination links are impractical. This work investigates the single allocation hub location problem, and proposes a genetic algorithm (GA) approach for it. The effectiveness of using a single-objective criterion measure for the problem is first explored. Next, a multi-objective GA employing various fitness evaluation strategies such as Pareto ranking, sum of ranks, and weighted sum strategies is presented. The effectiveness of the multi-objective GA is shown by comparison with an Integer Programming strategy, the only other multi-objective approach found in the literature for this problem. Lastly, two new crossover operators are proposed and an empirical study is done using small to large problem instances of the Civil Aeronautics Board (CAB) and Australian Post (AP) data sets.
    • Network Similarity Measures and Automatic Construction of Graph Models using Genetic Programming

      Harrison, Kyle Robert; Department of Computer Science (Brock University, 2014-09-05)
      A complex network is an abstract representation of an intricate system of interrelated elements where the patterns of connection hold significant meaning. One particular complex network is a social network whereby the vertices represent people and edges denote their daily interactions. Understanding social network dynamics can be vital to the mitigation of disease spread as these networks model the interactions, and thus avenues of spread, between individuals. To better understand complex networks, algorithms which generate graphs exhibiting observed properties of real-world networks, known as graph models, are often constructed. While various efforts to aid with the construction of graph models have been proposed using statistical and probabilistic methods, genetic programming (GP) has only recently been considered. However, determining that a graph model of a complex network accurately describes the target network(s) is not a trivial task as the graph models are often stochastic in nature and the notion of similarity is dependent upon the expected behavior of the network. This thesis examines a number of well-known network properties to determine which measures best allowed networks generated by different graph models, and thus the models themselves, to be distinguished. A proposed meta-analysis procedure was used to demonstrate how these network measures interact when used together as classifiers to determine network, and thus model, (dis)similarity. The analytical results form the basis of the fitness evaluation for a GP system used to automatically construct graph models for complex networks. The GP-based automatic inference system was used to reproduce existing, well-known graph models as well as a real-world network. Results indicated that the automatically inferred models exemplified functional similarity when compared to their respective target networks. This approach also showed promise when used to infer a model for a mammalian brain network.
    • Neural Network Guided Evolution of L-system Plants

      Chen, Xuhao (Eric); Department of Computer Science
      A Lindenmayer system (L-system) is a parallel rewriting system that generates graphic shapes using several rules. Genetic programming (GP) is an evolutionary algorithm that evolves expressions. A convolutional neural network (CNN) is a type of neural network that is useful for image recognition and classification. The goal of this thesis is to generate different styles of L-system based 2D images of trees from scratch using genetic programming. The system uses a convolutional neural network to evaluate the trees and produce a fitness value for genetic programming. Different architectures of CNN are explored. We analyze the performance of the system and show the capabilities of the combination of CNN and GP. We show that a variety of interesting tree images can be automatically evolved. We also found that the success of the system depends highly on CNN training, as well as on the form of the GP's L-system language representation.
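      The parallel-rewriting core of an L-system is small enough to sketch. This is a generic illustration, not the thesis's representation; the bracket and turtle-command symbols follow the common convention for drawing branching plants.

```python
def rewrite(axiom, rules, iterations):
    """Apply all production rules in parallel to every symbol, repeatedly.
    Symbols without a rule (e.g. the turtle commands +, -, [, ]) are
    copied through unchanged."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s
```

      With the classic branching rule, `rewrite("F", {"F": "F[+F]F[-F]F"}, 1)` yields `"F[+F]F[-F]F"`; a turtle interpreter then draws `F` as a forward segment, `+`/`-` as turns, and `[`/`]` as pushing and popping the drawing state, producing a tree shape.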
    • New Contig Creation Algorithm for the de novo DNA Assembly Problem

      Goodarzi, Mohammad; Department of Computer Science (Brock University, 2014-02-25)
      DNA assembly is among the most fundamental and difficult problems in bioinformatics. Near-optimal assembly solutions are available for bacterial and small genomes; however, assembling large and complex genomes, especially the human genome, using Next-Generation Sequencing (NGS) technologies has been shown to be very difficult because of the highly repetitive and complex nature of the human genome, short read lengths, uneven data coverage, and tools that are not specifically built for human genomes. Moreover, many algorithms are not even scalable to human genome datasets containing hundreds of millions of short reads. The DNA assembly problem is usually divided into several subproblems including DNA data error detection and correction, contig creation, scaffolding, and contig orientation; each can be seen as a distinct research area. This thesis specifically focuses on creating contigs from the short reads and combining them with outputs from other tools in order to obtain better results. Three different assemblers, SOAPdenovo [Li09], Velvet [ZB08], and Meraculous [CHS+11], are selected for comparative purposes in this thesis. Obtained results show that this thesis' work produces results comparable to other assemblers, and that combining our contigs with outputs from other tools produces the best results, outperforming all other investigated assemblers.
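      A toy illustration of contig creation by greedy suffix/prefix overlap merging, far simpler than the thesis's algorithm and intended only to show the idea; the function names and the `min_len` threshold are assumptions.

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of `a` that is a prefix of `b`,
    or 0 if no overlap of at least `min_len` bases exists."""
    for length in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:length]):
            return length
    return 0

def greedy_contig(reads, min_len=3):
    """Repeatedly merge the pair of reads with the largest suffix/prefix
    overlap until no pair overlaps by at least `min_len` bases; the
    surviving strings are the contigs."""
    reads = list(reads)
    while len(reads) > 1:
        best = (0, None, None)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    olen = overlap(a, b, min_len)
                    if olen > best[0]:
                        best = (olen, i, j)
        olen, i, j = best
        if olen == 0:
            break
        merged = reads[i] + reads[j][olen:]
        reads = [r for k, r in enumerate(reads) if k not in (i, j)] + [merged]
    return reads
```

      Real assemblers replace this quadratic pairwise search with de Bruijn graph or suffix-index structures; the greedy version merely makes the contig-creation subproblem concrete.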
    • Non-photorealistic Rendering with Cartesian Genetic Programming using Graphic Processing Units

      Bakurov, Illya; Department of Computer Science
      Non-photorealistic rendering (NPR) is concerned with the algorithmic generation of images having unrealistic characteristics, for example, oil paintings or watercolours. Using genetic programming to evolve aesthetically pleasing NPR images is a relatively new approach in the art field, and in the majority of cases it takes a long time to generate results. With the use of Cartesian genetic programming (CGP) and graphics processing units (GPUs), we can improve the performance of NPR image evolution. Evolutionary NPR can render images with interesting, and often unexpected, graphic effects. CGP provides a means to eliminate large, inefficient rendering expressions, while GPU acceleration parallelizes the calculations, which minimizes the time needed to get results. By using these tools, we can speed up the image generation process. Experiments revealed that CGP expressions are more concise, and search is more exploratory, than in tree-based approaches. Implementation of the system with GPUs showed significant speed-up.
    • Object Classification using L-Fuzzy Concept Analysis

      Addison, George Tsekpetse; Department of Computer Science
      Object classification and processing have become an integral part of modern industrial manufacturing systems, generally utilized in a manual or computerized inspection process. Vagueness is a common issue in object classification and analysis, such as ambiguity in the input data, overlapping boundaries among classes or regions, and indefiniteness in defining or extracting features and the relations among them. The main purpose of this thesis is to construct, define, and implement an abstract algebraic framework for L-fuzzy relations to represent the uncertainties involved at every stage of object classification. The framework handles this vagueness by retaining as much information as possible from the original data, so that decisions made at the highest level yield a final output with the least uncertainty.
    • Object-Oriented Genetic Programming for the Automatic Inference of Graph Models for Complex Networks

      Medland, Michael; Department of Computer Science
      Complex networks are systems of entities that are interconnected through meaningful relationships. The result of the relations between entities forms a structure that has a statistical complexity that is not formed by random chance. In the study of complex networks, many graph models have been proposed to model the behaviours observed. However, constructing graph models manually is tedious and problematic, and many of the models proposed in the literature have been cited as having inaccuracies with respect to the complex networks they represent. Recently, an approach that automates the inference of graph models was proposed by Bailey [10]. The proposed methodology employs genetic programming (GP) to produce graph models that approximate various properties of an exemplary graph of a targeted complex network. However, there is a great deal already known about complex networks in general, and often specific knowledge is held about the network being modelled. This knowledge, albeit incomplete, is important in constructing a graph model, yet it is difficult to incorporate using existing GP techniques. Thus, this thesis proposes a novel GP system which can incorporate incomplete expert knowledge to assist in the evolution of a graph model. Inspired by existing graph models, an abstract graph model was developed to serve as an embryo for inferring graph models of some complex networks. The GP system and abstract model were used to reproduce well-known graph models. The results indicated that the system was able to evolve models that produced networks with structural similarities to the networks generated by the respective target models.
    • Objective reduction in many-objective optimization problems

      Sen Gupta, Arpi; Department of Computer Science
      Many-objective optimization problems (MaOPs) are multi-objective optimization problems which have more than three objectives. MaOPs face significant challenges in search efficiency, computational cost, decision making, and visualization. Many well-known multi-objective evolutionary algorithms do not scale well with an increasing number of objectives, and objective reduction can alleviate such difficulties. However, most research in objective reduction uses non-dominated sorting or Pareto ranking, and Pareto ranking is effective only in problems having fewer than four objectives. In this research, we use two approaches to objective reduction: random-based and linear coefficient-based. We use the sum of ranks instead of Pareto ranking; when applied to many-objective problems, the sum of ranks has outperformed many other optimization approaches. We also use the age-layered population structure (ALPS) in our approach to reduce premature convergence and improve results. The performance of the proposed methods has been studied extensively on the well-known DTLZ benchmark problems. The original GA and ALPS outperform the objective reduction algorithms in many DTLZ test cases. Among all reduction algorithms, the linear coefficient-based reduction algorithm provides better performance for some problems in this test suite. Random-based reduction is not an appropriate strategy for reducing objectives.
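      A rough sketch of linear coefficient-based objective reduction: many objectives are collapsed into a few composite objectives via linear combinations. The random non-negative coefficient scheme shown here is an assumption for illustration; the exact scheme used in the thesis may differ.

```python
import random

def reduce_objectives(objectives, k, seed=None):
    """Build k composite objectives, each a random non-negative linear
    combination of the original objective functions, reducing an
    m-objective problem to a k-objective one (k < m)."""
    rng = random.Random(seed)
    weight_sets = [[rng.random() for _ in objectives] for _ in range(k)]
    def make(weights):
        return lambda x: sum(w * f(x) for w, f in zip(weights, objectives))
    return [make(ws) for ws in weight_sets]
```

      The reduced problem can then be handed to an ordinary multi-objective GA, which is the sense in which reduction sidesteps the poor scaling of Pareto-based selection.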
    • Optimal Quaternary Hermitian Linear Complementary Dual Codes for Entanglement-Assisted Quantum Error Correction

      Al Jumaily, Maysara; Department of Computer Science
      The objective of this thesis is to find suboptimal and optimal parameters from classical codes and import them into entanglement-assisted quantum codes. The thesis begins by introducing classical error correction, followed by a detailed introduction to quantum computing. Topics discussed in the introduction include qubits, quantum phenomena such as superposition and entanglement, and quantum gates/circuits. The thesis then reviews the basics of quantum error correction and provides Shor's code to reinforce the reader's understanding. Subsequently, the formalism of stabilizer codes is thoroughly examined. We then explain entanglement-assisted quantum codes, a generalization of stabilizer codes that does not require the generators to satisfy the commutativity property; rather, ebits are used to resolve the anti-commutativity constraint. Next, the thesis explains the quaternary field and the Java program implemented to find the optimal parameters. Lastly, the thesis concludes by presenting the parameters of the new codes obtained throughout this research. We have found the suboptimal largest distance for quaternary Hermitian linear complementary dual codes that can be imported as entanglement-assisted quantum error correction for parameters [22, 9, 9 or 10]₄, [22, 12, 7 or 8]₄, [23, 8, 11 or 12]₄, [23, 10, 9 or 10]₄, [23, 13, 7 or 8]₄, [24, 10, 10 or 11]₄, [24, 11, 9 or 10]₄, [24, 14, 7 or 8]₄, [25, 12, 9 or 10]₄, [25, 13, 8 or 9]₄, as well as the optimal largest distance for [17, 11, 5]₄ and [17, 13, 3]₄.
    • Particle swarm optimization for two-connected networks with bounded rings

      Foxwell, Earl Brendan; Department of Computer Science (Brock University, 2009-01-28)
      The Two-Connected Network with Bounded Ring (2CNBR) problem is a network design problem addressing the connection of servers to create a survivable network with limited redirections in the event of failures. Particle Swarm Optimization (PSO) is a stochastic population-based optimization technique modeled on the social behaviour of flocking birds or schooling fish. This thesis applies PSO to the 2CNBR problem. As PSO is originally designed to handle a continuous solution space, modification of the algorithm was necessary in order to adapt it for such a highly constrained discrete combinatorial optimization problem. Presented are an indirect transcription scheme for applying PSO to such discrete optimization problems and an oscillating mechanism for averting stagnation.
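      The canonical continuous PSO update that the thesis transcribes onto the discrete 2CNBR solution space can be sketched as follows. This is the standard textbook form, not the thesis's adapted version; the inertia and acceleration values are conventional defaults, not parameters taken from the work.

```python
import random

def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One canonical (continuous) PSO update: inertia on the old velocity,
    plus stochastic cognitive attraction toward the personal best and
    social attraction toward the global best."""
    for i in range(len(positions)):
        new_v = [w * vj + c1 * random.random() * (pb - xj)
                 + c2 * random.random() * (gb - xj)
                 for vj, xj, pb, gb in zip(velocities[i], positions[i],
                                           pbest[i], gbest)]
        velocities[i] = new_v
        positions[i] = [xj + vj for xj, vj in zip(positions[i], new_v)]
    return positions, velocities
```

      The indirect transcription in the thesis maps these real-valued positions onto discrete network designs, which is what allows a continuous-space method to search a constrained combinatorial space.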
    • Passive Solar Building Design Using Genetic Programming

      Oraei Gholami, Mohammad Mahdi; Department of Computer Science (Brock University, 2013-10-28)
      Passive solar building design is the process of designing a building while considering sunlight exposure to receive heat in winter and reject heat in summer. The main goal of passive solar building design is to remove or reduce the need for mechanical and electrical cooling and heating systems, thereby saving energy costs and reducing environmental impact. This research uses evolutionary computation to design passive solar buildings. Evolutionary design is used in many research projects to build 3D models of structures automatically. In this research, we use a mixture of split grammars and string rewriting for generating new 3D structures. To evaluate energy costs, the EnergyPlus system, a comprehensive building energy simulation system, is used alongside the genetic programming system. In addition, genetic programming also considers other design and geometry characteristics of the building as search objectives, for example, window placement, building shape, size, and complexity. In passive solar design, reducing the energy needed for cooling and for heating are two objectives of interest. Experiments show that smaller buildings with no windows or skylights are the most energy-efficient models. Window heat gain is another objective, used to encourage models to have windows. In addition, window- and volume-based objectives are examined. To examine the impact of environment on designs, experiments are run on five different geographic locations, and both single-floor and multi-floor models are examined. Solutions were consistent with respect to materials, sizes, and appearance, and satisfied problem constraints in all instances.