Papers, presentations, reports and more, written in LaTeX and published by our community. Search or browse below.
A Look at Hilbert Spaces
A look at Hilbert spaces, and the question: "Are there natural, separable Hilbert spaces on the Euclidean ball for which all composition operators are bounded?"
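For orientation, the boundedness question can be stated using the standard definition of a composition operator (the symbols $\varphi$, $C_\varphi$, and $\mathcal{H}$ below are conventional notation, not taken from the paper itself):

```latex
Let $\varphi : B_n \to B_n$ be a holomorphic self-map of the Euclidean
ball $B_n \subset \mathbb{C}^n$, and let $\mathcal{H}$ be a Hilbert
space of functions on $B_n$. The composition operator with symbol
$\varphi$ is
\[
  C_\varphi f = f \circ \varphi ,
\]
and the question asks for which natural, separable $\mathcal{H}$ every
such $C_\varphi$ is bounded, i.e.\ satisfies
$\|C_\varphi f\|_{\mathcal{H}} \le M_\varphi \|f\|_{\mathcal{H}}$
for some constant $M_\varphi$ and all $f \in \mathcal{H}$.
```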
Ryan T Whyte
Time-Frequency Analysis and the Wavelet Transform
A brief (let's not kid ourselves, it's long) introduction to the continuous and discrete wavelet transforms. Comments on computer implementations using MATLAB and other software are also included.
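To give a flavor of the discrete side, one level of the simplest discrete wavelet transform (the Haar transform) can be sketched as below. This is an illustrative sketch only, not code from the work; MATLAB's wavelet routines and libraries such as PyWavelets apply the same averaging/differencing idea with other filter families.

```python
import numpy as np

def haar_dwt_step(x):
    """One level of the discrete Haar wavelet transform.

    Splits a signal of even length into approximation (low-pass)
    and detail (high-pass) coefficients.
    """
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)  # local averages
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)  # local differences
    return approx, detail

a, d = haar_dwt_step([4.0, 4.0, 2.0, 0.0])
```

A constant stretch of signal produces a zero detail coefficient, which is why wavelet coefficients compress smooth signals well.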
This is a project to develop students' understanding of Newton's Method using the tools available in GeoGebra.
This project was adapted from a similar project developed by folks at Grand Valley State University. (If any of you see this and would like more specific attributions, please let me know.)
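The tangent-line iteration that the GeoGebra activity visualizes can be sketched in a few lines of Python. The function, derivative, and starting point below are illustrative choices, not fixed by the project:

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton's method: repeatedly jump to the x-intercept of the
    tangent line at the current estimate."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)  # tangent-line x-intercept update
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: approximate sqrt(2) as the positive root of x^2 - 2
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

Plotting each tangent line at the successive estimates reproduces the picture students build interactively in GeoGebra.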
An application of the Ncut algorithm, with an open-source implementation (in the R environment).
Although data analysis is a task that has gained the interest of the statistical community in recent years, and the current community (students and teachers alike) is encouraged to produce reproducible statistical analyses with the R computing environment, for years there has been a gap in R between large-scale matrix computation and the term "big data". In this work the Normalized Cut algorithm for images is applied. Contrary to what one might expect, the R environment for image analysis is poor in comparison with other computing platforms such as the Python language or with specialized software such as OpenCV.
Given the well-known absence of such functionality, in this work we share an implementation of the Normalized Cut algorithm in the R environment, with extensions to programs and processes implemented in C++, to provide the user with a friendly R interface for segmenting images. The article concludes by evaluating the current implementation and looking for ways to generalize it to a large-scale context and to reuse the developed code.
Keywords: Normalized Cut, image segmentation, Lanczos algorithm, eigenvalues and eigenvectors, graphs, similarity matrix, R (the statistical computing environment), open source, large scale, big data.
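The core two-way Normalized Cut step behind the implementation can be sketched with a small dense example (a Python/NumPy sketch of the Shi–Malik criterion, not the paper's R/C++ code, which targets large sparse similarity matrices via the Lanczos algorithm):

```python
import numpy as np

def ncut_bipartition(W):
    """Two-way Normalized Cut on a symmetric similarity matrix W.

    Solves the generalized eigenproblem (D - W) y = lambda * D * y
    via the symmetric normalized Laplacian, then splits the graph
    on the sign of the second-smallest eigenvector.
    """
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    # Symmetric normalized Laplacian: D^{-1/2} (D - W) D^{-1/2}
    L_sym = D_inv_sqrt @ (np.diag(d) - W) @ D_inv_sqrt
    eigvals, eigvecs = np.linalg.eigh(L_sym)  # ascending eigenvalues
    # Map the second-smallest eigenvector back: y = D^{-1/2} z
    y = D_inv_sqrt @ eigvecs[:, 1]
    return y >= 0  # boolean segment labels

# Toy graph: two tightly connected groups, weakly linked to each other
W = np.full((6, 6), 0.01)
W[:3, :3] = 1.0
W[3:, 3:] = 1.0
np.fill_diagonal(W, 0.0)
labels = ncut_bipartition(W)
```

For an image, W would hold pixel similarities (intensity and proximity), and this dense eigensolve is exactly what a Lanczos-based sparse solver replaces at scale.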
José Antonio García
Qualifying Exam - Scene Recognition Based on the Locations of Federal Government Suppliers
In most corruption scandals, the use of front companies for money laundering is almost ubiquitous. This work proposes to apply image classification to detect such organizations, through the use of Convolutional Neural Networks (CNNs), namely the AlexNet architecture. The images are obtained by address search in the Google Street View API, and the resulting classification will later be used along with other features to detect front companies, in order to help the auditors from the Ministry of Transparency and Office of the Comptroller General (CGU, in Portuguese). To date, we have applied classification to almost 15 thousand supplier scenes with contracts active with the Brazilian Government until September 2016, obtained through data matching between the Government Purchases database and the Brazilian Federal Revenue Office database (more recent scenes should be added as this work progresses). Preliminary results with a pre-trained AlexNet CNN show the need to develop new scene classes better suited to the Brazilian context. To do this, we propose to apply clustering algorithms to features extracted from the last fully connected layer of this net. The classes obtained will then be used to retrain the AlexNet CNN for future classification, either from scratch or with fine-tuning techniques.
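The proposed clustering stage can be sketched with plain k-means, one candidate clustering algorithm (the abstract does not fix which one is used). The CNN forward pass that would produce the feature matrix is assumed to have been done already; the toy data below stands in for it:

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Plain k-means over an (n_scenes, n_features) matrix X,
    e.g. features taken from a CNN's last fully connected layer."""
    # Deterministic, spread-out initialization (k-means++ would be
    # a better choice in practice)
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(n_iter):
        # Assign each sample to its nearest center
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        # Recompute centers, keeping the old one if a cluster empties
        centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
    return labels

# Toy stand-in for two visually distinct groups of scenes
X = np.vstack([np.zeros((5, 4)), 10.0 * np.ones((5, 4))])
labels = kmeans(X, k=2)
```

Each resulting cluster would be inspected and, if coherent, promoted to a scene class for retraining the network.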
Rodrigo Peres Ferreira
ATAUDIW - An Authoring Tool to Help Use of the Interactive Digital Whiteboard
The use of technological resources in education has led to positive changes in the elaboration of new methodologies; in this context, technologies such as the Digital Interactive Whiteboard (DIW) can facilitate learning. The mere presence of the DIW does not guarantee benefits for the student's learning process, which raises doubts about whether the available resources are used satisfactorily. In this research it was possible to verify that there are few tools available for the DIW context, and many of them have problems of usability and content quality. Thus, one way to facilitate content elaboration for the DIW is the use of Authoring Tools (ATs). In order to verify whether the use of ATs promotes better use of the DIW, an AT (entitled AtauDIW) was developed to assist the use of DIWs.