Papers, presentations, reports and more, written in LaTeX and published by our community. Search or browse below.
General Game Player
General Game Playing (GGP) is the playing of a wide variety of games, possibly never seen before, given nothing but the rules of each game at run time. This sets it apart from traditional game-specific players such as the famous chess player Deep Blue. Deep Blue can beat the world chess champion at chess; however, it has absolutely no idea how to play checkers. It is designed for one particular game, cannot adapt to rule changes, and certainly cannot play entirely different games. The goal of this project is to create a program that will play a wide variety of 2D games given descriptions of their rules, without the creator of the program ever having known of the games. This report covers the design and implementation of the project, the background research performed, and reflections on the outcome of the project.
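As a rough illustration of the "one program, any game" idea, the sketch below plays any two-player game expressed through an abstract rules interface. The interface names (`legal_moves`, `apply`, `is_terminal`, `score`) and the toy Nim rules are hypothetical, invented here for illustration, and are not taken from the project's design:

```python
# Sketch of a game-agnostic player: the program knows nothing about the game
# beyond what an abstract rules object exposes. The interface below is a
# hypothetical stand-in for a parsed rules description.

def best_move(state, rules, player, depth=8):
    """Choose a move by plain minimax over the abstract rules interface."""
    def value(s, to_move, d):
        if rules.is_terminal(s) or d == 0:
            return rules.score(s, to_move, player)  # payoff (terminal here)
        vals = [value(rules.apply(s, m), 1 - to_move, d - 1)
                for m in rules.legal_moves(s, to_move)]
        return max(vals) if to_move == player else min(vals)
    return max(rules.legal_moves(state, player),
               key=lambda m: value(rules.apply(state, m), 1 - player, depth - 1))

class Nim:
    """Tiny rules description: take 1 or 2 stones; taking the last one wins."""
    def legal_moves(self, s, to_move): return [m for m in (1, 2) if m <= s]
    def apply(self, s, m): return s - m
    def is_terminal(self, s): return s == 0
    def score(self, s, to_move, player):
        # At a terminal state the previous mover took the last stone and wins.
        return 1 if (1 - to_move) == player else -1

move = best_move(4, Nim(), 0)  # taking 1 leaves the opponent a losing position
```

The same `best_move` function would play any game whose rules can be wrapped in this interface, which is the essence of the generality GGP asks for.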
Entropy Minimization Based Synchronization Algorithm for Underwater Acoustic Receivers
This paper presents a new entropy minimization criterion, and corresponding algorithms, used for both symbol timing and carrier frequency recovery in underwater acoustic receivers. It relies on entropy estimation of the eye diagram and the constellation diagram of the received signal. During the parameter search, the entropy reaches a global minimum when perfect synchronization is achieved, indicating the least intersymbol interference or a restored constellation diagram. Unlike other synchronization methods, this unified criterion can be used to build an all-in-one synchronizer with high accuracy. The feasibility of the method is demonstrated through theoretical analysis and supported by sea trial measurement data.
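A minimal sketch of the criterion's core idea, on a toy baseband signal rather than real acoustic data: estimate entropy from a histogram of the samples taken at each candidate timing offset, and keep the offset that minimizes it. All signal parameters here (BPSK symbols, 8 samples per symbol, linear ramps standing in for intersymbol interference) are invented for illustration:

```python
import math

def entropy(samples, bins=20):
    """Histogram-based entropy estimate of a set of real-valued samples."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for x in samples:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(samples)
    return -sum(c / n * math.log(c / n) for c in counts if c)

# Toy received signal: BPSK symbols, 8 samples per symbol; linear
# transitions between symbols stand in for intersymbol interference.
SPS = 8
bits = [1, -1, -1, 1, 1, -1, 1, -1, -1, 1] * 10
signal = []
for a, b in zip(bits, bits[1:] + [bits[0]]):
    for k in range(SPS):
        signal.append(a + (b - a) * k / SPS)  # ramp toward the next symbol

# Entropy-minimization timing search: the correct offset (0 here) samples
# only +/-1, giving the lowest-entropy "eye"; wrong offsets catch the ramps.
best = min(range(SPS), key=lambda o: entropy(signal[o::SPS]))
```

At the correct offset the histogram collapses onto the two symbol values, so the estimated entropy hits its minimum, which is exactly the global-minimum behaviour the criterion exploits.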
An application of the Ncut algorithm, with an open-source implementation (in the R environment).
Data analysis has attracted growing interest from the statistical community in recent years, and familiarity with the statistical computing environment has encouraged that community (students and teachers in the area alike) to produce reproducible statistical analyses using the R tool. However, a gap has long existed between large-scale matrix computation and what the term "big data" demands; in this work the Normalized Cut algorithm is applied to images. Contrary to what might be expected, the R environment is poorly equipped for image analysis in comparison with other computing platforms such as the Python language or specialized software such as OpenCV.
Given the well-known absence of such functionality, in this work we share an implementation of the Normalized Cut algorithm in the R environment, with extensions to programs and processes written in C++, to provide the user with a friendly R interface for segmenting images. The article concludes by evaluating the current implementation and exploring ways to generalize it to a large-scale context and to reuse the developed code.
Key words: Normalized Cut, image segmentation, Lanczos algorithm, eigenvalues and eigenvectors, graphs, similarity matrix, R (the statistical computing environment), open source, large scale, big data.
José Antonio García
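The published implementation is in R with C++ extensions; purely as an illustration of the underlying spectral step (using a small dense eigensolver here rather than the Lanczos algorithm the work relies on), a two-way Normalized Cut on a toy similarity matrix might be sketched as:

```python
import numpy as np

def ncut_bipartition(W):
    """Two-way Normalized Cut (Shi-Malik relaxation): split by the sign of
    the second eigenvector of D^{-1/2} (D - W) D^{-1/2}."""
    d = W.sum(axis=1)                         # node degrees
    d_isqrt = 1.0 / np.sqrt(d)
    L = np.diag(d) - W                        # graph Laplacian
    L_sym = d_isqrt[:, None] * L * d_isqrt[None, :]
    vals, vecs = np.linalg.eigh(L_sym)        # eigenvalues in ascending order
    y = d_isqrt * vecs[:, 1]                  # Fiedler-like indicator vector
    return y >= 0                             # boolean segment labels

# Toy "image": two tight pixel clusters joined by one weak similarity.
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.1
labels = ncut_bipartition(W)
```

For real images the similarity matrix is large and sparse, which is why the implementation described above delegates the eigencomputation to Lanczos-based C++ routines instead of a dense solver.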
This is a project to develop students' understanding of Newton's Method using the tools available in GeoGebra.
This project was adapted from a similar project developed by folks at Grand Valley State University. (If any of you see this and would like more specific attributions, please let me know.)
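The iteration the project visualizes is the classical Newton update x_{n+1} = x_n - f(x_n)/f'(x_n), which follows the tangent line at the current estimate down to the x-axis; a minimal sketch, with the square-root example chosen here purely for illustration:

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton's method: repeatedly follow the tangent line toward a root."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)   # tangent-line correction
        x -= step
        if abs(step) < tol:       # stop once the update is negligible
            break
    return x

# f(x) = x^2 - 2 has the root sqrt(2); start from x0 = 1.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

Each iterate is the x-intercept of the tangent at the previous one, which is the geometric picture GeoGebra makes easy to explore interactively.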
The Parallelization and Optimization of the N-Body Problem Using OpenMP and CUDA
This research paper explores efficient ways of implementing the N-Body problem. The N-Body problem, in the field of physics, predicts the movements of planets and their gravitational interactions. This paper examines the efficient execution of heavy computational work across different cores of the CPU and GPU, achieved by integrating the OpenMP parallelization API and Nvidia CUDA into the code. The paper also presents a performance analysis of various algorithms used to solve the same problem. This research not only serves as an alternative to complex simulations but also addresses larger data sets that require work distribution and computationally expensive procedures.
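The computational core being parallelized is the all-pairs gravitational force loop. A serial Python sketch of that kernel follows; the outer loop over bodies is what OpenMP threads or CUDA thread blocks would divide up, and the masses and positions used below are illustrative only:

```python
# Serial sketch of the O(n^2) gravitational kernel; each iteration of the
# outer loop is independent, which is what makes it parallelizable.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def accelerations(positions, masses):
    """Acceleration on each body from all others (naive all-pairs sum)."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):                    # parallelizable over i
        xi, yi, zi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            dz = positions[j][2] - zi
            r2 = dx * dx + dy * dy + dz * dz
            f = G * masses[j] / (r2 * r2 ** 0.5)   # G * m_j / r^3
            acc[i][0] += f * dx
            acc[i][1] += f * dy
            acc[i][2] += f * dz
    return acc

# Two equal masses one metre apart attract each other symmetrically.
acc = accelerations([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)], [1e10, 1e10])
```

Because every body's acceleration depends only on read-only positions and masses, the outer loop has no data races, which is precisely why an OpenMP `parallel for` or a one-thread-per-body CUDA launch maps onto it cleanly.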
Qualifying Exam Report - Scene Recognition Based on the Locations of Federal Government Suppliers
In most corruption scandals, the use of front companies for money laundering is almost ubiquitous. This work proposes to apply image classification to detect such organizations through the use of Convolutional Neural Networks (CNNs), namely the AlexNet architecture. The images are obtained by address search in the Google Street View API, and the resulting classification will be used along with other features to detect front companies, in order to help the auditors from the Ministry of Transparency and Office of the Comptroller General (CGU, in Portuguese). So far, we have applied classification to almost 15 thousand supplier scenes with contracts active with the Brazilian Government until September 2016, obtained through data matching between the Government Purchases database and the Brazilian Federal Revenue Office database (more recent scenes should be added as this work progresses). Preliminary results with a pre-trained AlexNet CNN show the need for developing new scene classes better suited to the Brazilian context. To this end, we propose to apply clustering algorithms to features extracted from the last fully-connected layer of this network. The classes obtained will then be used to adapt the AlexNet CNN for future classification, either by training from scratch or by fine-tuning.
Rodrigo Peres Ferreira
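As an illustration of the proposed clustering step, the sketch below runs a plain k-means on mock two-dimensional points standing in for CNN feature vectors; in the actual work the inputs would be the high-dimensional activations of the last fully-connected layer, and the deterministic first-k initialization is a toy simplification:

```python
# Plain k-means sketch; `feats` are mock stand-ins for fc-layer features.
def kmeans(points, k, iters=20):
    """Cluster points into k groups; first k points seed the centroids."""
    centroids = [list(p) for p in points[:k]]
    dist2 = lambda p, c: sum((a - b) ** 2 for a, b in zip(p, c))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                       # assign to nearest centroid
            clusters[min(range(k), key=lambda i: dist2(p, centroids[i]))].append(p)
        for i, cl in enumerate(clusters):      # recompute centroid means
            if cl:
                centroids[i] = [sum(col) / len(cl) for col in zip(*cl)]
    labels = [min(range(k), key=lambda i: dist2(p, centroids[i])) for p in points]
    return centroids, labels

# Two well-separated mock "scene descriptor" groups.
feats = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
cents, labels = kmeans(feats, 2)
```

The clusters found this way would then name the candidate scene classes used to fine-tune the network.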