Articles — Popular

Papers, presentations, reports and more, written in LaTeX and published by our community. Search or browse below.

Identifying Sunspots.
Our project, "Identifying Sunspots," involved observing sunspots and attempting to collect extensive data on them. We put together a detailed summary of what sunspots are, how they form, how to identify them, and why they are important to us. The in-class presentation gave us an opportunity to collaborate as a group, learn something on our own, collect data, and share what we discovered with our classmates.
David
Applications of Compressive Sensing in Communications and Signal Processing
Compressive Sensing (CS) is a signal processing technique that emerged as a breakthrough in 2004. The main idea of CS is that, by exploiting the sparsity of a signal (in some domain), we can reconstruct the signal from far fewer samples than the Shannon-Nyquist sampling theorem requires. Reconstructing a sparse signal from few samples is equivalent to solving an under-determined system with sparsity constraints. The least-squares solution to such a problem yields poor results, because sparse signals are not well approximated by the least-norm solution. Instead we solve the problem with the (convex) ℓ1 norm, which is the best convex surrogate for the exact solution given by the (non-convex) ℓ0 norm. In this paper we discuss three applications of CS in estimation theory: reliable CS-based channel estimation for TDS-OFDM systems, assuming the channel's sparsity is known [1]; indoor location estimation from received signal strength (RSS), where CS is used to reconstruct the radio map from RSS measurements [2]; and identifying the subspace in which the signal of interest lies via ML estimation, under the standard CS assumption that the signal lies in a union of subspaces [3]. Index terms: compressive sensing, indoor positioning, fingerprinting, radio map, maximum likelihood estimation, union of linear subspaces, subspace recovery.
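As a minimal illustration of the recovery problem sketched in this abstract (the notation here is generic, not taken from the paper), the ℓ1 reconstruction, often called basis pursuit, is

\[
\hat{x} \;=\; \arg\min_{x \in \mathbb{R}^{n}} \; \lVert x \rVert_{1}
\quad \text{subject to} \quad y = \Phi x ,
\]

where $y \in \mathbb{R}^{m}$ collects the $m \ll n$ measurements and $\Phi$ is the $m \times n$ sensing matrix, so $y = \Phi x$ is the under-determined system. Replacing $\lVert x \rVert_{1}$ with the non-convex $\lVert x \rVert_{0}$ gives the exact sparse-recovery problem, while the $\lVert x \rVert_{2}$ (least-norm) solution is the one that performs poorly for sparse signals.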
mohangiridhar
Period of a Pendulum Lab
Lab report for the Period of a Pendulum experiment.
Carina Page
Taller: Introducción a R
An introductory workshop on R.
Carlos
PRACTICA 1: Seguridad en el laboratorio.
The measurements called for in the practical were carried out in order to verify the resistance of a parallel circuit, first by calculation and then by measuring it on a real circuit. The values of several resistors were determined by reading their color codes, and afterwards 10 identical resistors were measured with an ohmmeter to check their values. For every procedure the absolute error, the relative error, and its percentage were obtained.
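For reference, these formulas are not stated in the abstract, but the standard definitions presumably used are the equivalent resistance of $n$ resistors in parallel and the usual error measures:

\[
\frac{1}{R_{\mathrm{eq}}} = \sum_{i=1}^{n} \frac{1}{R_{i}}, \qquad
\Delta R = \lvert R_{\mathrm{measured}} - R_{\mathrm{calculated}} \rvert, \qquad
\varepsilon_{r} = \frac{\Delta R}{R_{\mathrm{calculated}}}, \qquad
\varepsilon_{\%} = 100\,\varepsilon_{r}.
\]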
Pedro Mendoza
University College Cork - MA Module Assignment
Template source: Short Sectioned Assignment LaTeX Template, Version 1.0 (5/5/12). This template was downloaded from http://www.LaTeXTemplates.com. Original author: Frits Wenneker (http://www.howtotex.com). License: CC BY-NC-SA 3.0 (http://creativecommons.org/licenses/by-nc-sa/3.0/).
Brian Sheridan
FF
These are the words that I manifest.
Anonym
JMP: A New Keynesian Model Useful for Policy
I add price dispersion to a benchmark zero-inflation steady-state New Keynesian model. I do so by assuming the economy has experienced a history of shocks which have caused the Central Bank to miss its targets for inflation and output, as opposed to the conventional practice of linearizing around a non-stochastic steady state. I then allow the inflation-targeting Central Bank to optimize policy. The results are truly startling.

The model simultaneously embeds endogenous inflation and interest-rate persistence in an institutionally consistent optimizing framework. This creates a meaningful trade-off between inflation and output-gap stabilization following demand and technology shocks. This resolves the so-called 'Divine Coincidence', and explains both the preference for 'coarse-tuning' over 'fine-tuning' and the focus in policy circles on inflation forecast targeting. When estimated, the model performs well against a battery of demanding econometric tests.

Along the way, a novel econometric test of the 'Divine Coincidence' is developed; it is rejected in favor of a substantial trade-off. A welfare equivalence is derived between a class of New Keynesian models and their flexible-price counterparts, suggesting previously proposed resolutions may be inadequate. Finally, a novel paradox relating the 'Divine Coincidence' to 'fine-tuning' stabilization policy is derived.
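For context on the 'Divine Coincidence' referred to above (this equation is the textbook benchmark, not taken from the paper), the standard New Keynesian Phillips curve is

\[
\pi_{t} = \beta\, \mathbb{E}_{t}[\pi_{t+1}] + \kappa\, x_{t} + u_{t},
\]

where $\pi_{t}$ is inflation, $x_{t}$ the output gap, $\beta$ the discount factor, $\kappa$ the slope, and $u_{t}$ a cost-push disturbance. When $u_{t} \equiv 0$, stabilizing inflation automatically stabilizes the output gap (the 'Divine Coincidence'); a genuine trade-off requires a term like $u_{t}$, which the abstract argues arises endogenously from price dispersion.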
David Staines