A curated list of resources on implicit neural representations, inspired by awesome-computer-vision. Work-in-progress.
This list does not aim to be exhaustive, as implicit neural representations are a rapidly evolving & growing research field with hundreds of papers to date.
Instead, it aims to collect papers that introduce key concepts & foundations of implicit neural representations across applications. It's a great reading list if you want to get started in this area!
For most papers, there is a short summary of the most important contributions.
Disclosure: I am an author on the following papers:
- Scene Representation Networks: Continuous 3D-Structure-Aware Neural Scene Representations
- MetaSDF: Meta-Learning Signed Distance Functions
- Implicit Neural Representations with Periodic Activation Functions
- Inferring Semantic Information with 3D Neural Scene Representations
What are implicit neural representations?
Implicit Neural Representations (sometimes also referred to as coordinate-based representations) are a novel way to parameterize signals of all kinds. Conventional signal representations are usually discrete - for instance, images are discrete grids of pixels, audio signals are discrete samples of amplitudes, and 3D shapes are usually parameterized as grids of voxels, point clouds, or meshes. In contrast, Implicit Neural Representations parameterize a signal as a continuous function that maps the domain of the signal (i.e., a coordinate, such as a pixel coordinate for an image) to whatever is at that coordinate (for an image, an R,G,B color). Of course, these functions are usually not analytically tractable - it is impossible to "write down" the function that parameterizes a natural image as a mathematical formula. Implicit Neural Representations thus approximate that function via a neural network.
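To make this concrete, here is a minimal sketch of an implicit neural representation of an image, assuming a standard PyTorch setup (all names and hyperparameters here are illustrative, not taken from any particular paper): an MLP maps a 2D pixel coordinate to an RGB color, and the training set is simply the (coordinate, color) pairs of a single image.

```python
import torch
import torch.nn as nn

class ImplicitImage(nn.Module):
    """MLP mapping a 2D coordinate in [-1, 1]^2 to an RGB color."""
    def __init__(self, hidden=256, layers=4):
        super().__init__()
        net = [nn.Linear(2, hidden), nn.ReLU()]
        for _ in range(layers - 1):
            net += [nn.Linear(hidden, hidden), nn.ReLU()]
        net += [nn.Linear(hidden, 3)]
        self.net = nn.Sequential(*net)

    def forward(self, coords):      # coords: (N, 2)
        return self.net(coords)     # colors: (N, 3)

def fit(model, image, steps=2000, lr=1e-4):
    """Fit the network to a single image tensor of shape (H, W, 3)."""
    h, w, _ = image.shape
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, h),
                            torch.linspace(-1, 1, w), indexing="ij")
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
    colors = image.reshape(-1, 3)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((model(coords) - colors) ** 2).mean()  # per-pixel MSE
        loss.backward()
        opt.step()
    return model
```

In practice, a plain ReLU MLP like this struggles to represent high-frequency detail, which is one motivation for positional encodings and periodic activation functions (see the SIREN paper listed above).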
Why are they interesting?
Implicit Neural Representations have several benefits: First, they are no longer coupled to spatial resolution the way, for instance, an image is coupled to its number of pixels. This is because they are continuous functions! Thus, the memory required to parameterize the signal is independent of spatial resolution, and only scales with the complexity of the underlying signal. A corollary of this is that implicit representations have "infinite resolution" - they can be sampled at arbitrary spatial resolutions.
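Continuing the sketch above: because the representation is a continuous function, nothing ties it to the resolution of the training image, and it can be queried on a grid of any density (the model and coordinate convention are the ones assumed in the previous snippet).

```python
import torch

def sample(model, h, w):
    """Query the fitted network on an arbitrary h-by-w grid."""
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, h),
                            torch.linspace(-1, 1, w), indexing="ij")
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
    with torch.no_grad():
        return model(coords).reshape(h, w, 3)

# E.g., render at 4x the resolution of a 128x128 training image;
# the model itself never changes size.
upsampled = sample(model, 512, 512)
```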
This is immediately useful for a number of applications, such as super-resolution, or in parameterizing signals in 3D and higher dimensions, where memory requirements grow intractably fast with spatial resolution.
However, in the future, the key promise of implicit neural representations lies in algorithms that directly operate in the space of these representations. In other words: what is the "convolutional neural network" equivalent for signals represented by implicit neural representations? Questions like these offer a path towards a class of algorithms that are independent of spatial resolution!
Tuesday, December 29, 2020
The Awesome Implicit Neural Representations Highly Technical Reference Page
** Nuit Blanche is now on Twitter: @NuitBlog **
Here is a new curated page on the topic of Implicit Neural Representations, aptly called Awesome Implicit Neural Representations. It is curated by Vincent Sitzmann (@vincesitzmann) and has been added to the Highly Technical Reference Page.
From the page (see the excerpt reproduced at the top of this page).
Follow @NuitBlog or join the CompressiveSensing Reddit, the Facebook page, the Compressive Sensing group on LinkedIn or the Advanced Matrix Factorization group on LinkedIn
Other links:
Paris Machine Learning: Meetup.com || @Archives || LinkedIn || Facebook || @ParisMLGroup
About LightOn: Newsletter || @LightOnIO || on LinkedIn || on CrunchBase || our Blog
Tuesday, June 11, 2019
Deep Learning based compressive sensing - Highly Technical Reference Page/Aggregator, implementation -
** Nuit Blanche is now on Twitter: @NuitBlog **
Thuong Nguyen Canh just sent me the following e-mail featuring a new Highly Technical Reference Page/Aggregator on Deep Learning based compressive sensing as well as two of his recent papers with their implementations.
Dear Igor,
I just want to share a Github repo that I am maintaining for Deep Learning based compressive sensing and our recent paper toward multi-scale deep compressive sensing.
1. Multi-Scale Deep Compressive Sensing Network, IEEE VCIP 2018.
Abstract: With joint learning of the sampling and recovery, deep learning-based compressive sensing (DCS) has shown significant improvement in performance and a reduction in running time. Its reconstructed images, however, lose high-frequency content, especially at low subrates. This is understood to be a consequence of the sampling matrix capturing relatively more low-frequency information. The same behavior appears in the multi-scale sampling scheme, which also samples more low-frequency components. This paper proposes a multi-scale DCS network (MS-DCSNet) based on a convolutional neural network. First, we convert the image signal using a multiple-scale wavelet transform. Then, the signal is sampled through convolution, block by block, across scales. The initial reconstruction is recovered directly from the multi-scale measurements, and multi-scale wavelet convolution is utilized to enhance the final reconstruction quality. The network learns to perform both sampling and reconstruction at multiple scales, which results in better reconstruction quality.
Source code
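The central idea in the abstract above - a sampling matrix that is learned jointly with the recovery network and applied block by block via convolution - can be sketched in a few lines. This is a generic, hypothetical illustration of block-based convolutional compressive sampling, not the authors' MS-DCSNet (which adds the multi-scale wavelet structure):

```python
import torch
import torch.nn as nn

class BlockCS(nn.Module):
    def __init__(self, block=32, subrate=0.1):
        super().__init__()
        m = max(1, int(subrate * block * block))  # measurements per block
        # A stride-`block` convolution applies the same learned sampling
        # matrix to every non-overlapping block of the image.
        self.sample = nn.Conv2d(1, m, kernel_size=block, stride=block, bias=False)
        # A transposed convolution gives an initial reconstruction; a real
        # DCS network would refine this with further convolutional layers.
        self.recover = nn.ConvTranspose2d(m, 1, kernel_size=block, stride=block, bias=False)

    def forward(self, x):            # x: (N, 1, H, W), H and W multiples of `block`
        y = self.sample(x)           # (N, m, H/block, W/block) measurements
        return self.recover(y)       # initial reconstruction, same size as x

# Train end-to-end with a reconstruction loss, e.g. MSE against x.
model = BlockCS()
x_hat = model(torch.randn(1, 1, 128, 128))
```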
2. Difference of Convolution for Deep Compressive Sensing, IEEE ICIP 2019.
Abstract: Deep learning-based compressive sensing (DCS) has improved upon conventional compressive sensing (CS), offering fast, high-quality reconstruction. Researchers have further extended it to multi-scale DCS, which improves reconstruction quality based on wavelet decomposition. In this work, we mimic the Difference of Gaussians via convolution and propose a scheme named Difference-of-Convolution-based multi-scale DCS (DoC-DCS). Unlike multi-scale DCS, which relies on a well-designed filter in the wavelet domain, the proposed DoC-DCS learns the decomposition, thereby outperforming other state-of-the-art compressive sensing methods.
Source code
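As a rough, hypothetical illustration of the "difference of convolutions" idea - two learnable convolutions standing in for the two Gaussians of a Difference-of-Gaussians decomposition - consider the following sketch (it is not the full DoC-DCS network, which also learns the sampling and reconstruction):

```python
import torch
import torch.nn as nn

class DoC(nn.Module):
    """Learned two-scale decomposition via a difference of convolutions."""
    def __init__(self, k=5):
        super().__init__()
        self.blur_a = nn.Conv2d(1, 1, k, padding=k // 2, bias=False)
        self.blur_b = nn.Conv2d(1, 1, k, padding=k // 2, bias=False)

    def forward(self, x):
        low = self.blur_b(x)         # learned coarse (low-pass) component
        band = self.blur_a(x) - low  # difference of convolutions: band-pass
        return low, band

# Each learned scale could then be sampled separately, as in multi-scale DCS.
low, band = DoC()(torch.randn(1, 1, 64, 64))
```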
Best regards,
Thuong Nguyen Canh
Thanks Thuong!
Follow @NuitBlog or join the CompressiveSensing Reddit, the Facebook page, the Compressive Sensing group on LinkedIn or the Advanced Matrix Factorization group on LinkedIn
Other links:
Paris Machine Learning: Meetup.com || @Archives || LinkedIn || Facebook || @ParisMLGroup
About LightOn: Newsletter || @LightOnIO || on LinkedIn || on CrunchBase || our Blog
Friday, September 14, 2018
Highly Technical Reference Page: The Rice University Compressive Sensing page.
Rich sent this to me a few days ago:
Hi Igor -
I hope all goes well. FYI, the Rice CS Archive is back online after being down for more than a year, thanks to some Russian hackers who thought we had something to do with the 2018 election. It's available here:
richb
Richard G. Baraniuk
Victor E. Cameron Professor of Electrical and Computer Engineering
Founder and Director, OpenStax
Rice University
The Rice page is one of the first pages that got me thinking I should list all those Highly Technical Reference Pages in one fell swoop.
Tuesday, May 23, 2017
A 2.9 TOPS/W Deep Convolutional Neural Network SoC in FD-SOI 28nm for Intelligent Embedded Systems (and a Highly Technical Reference page on Neural Networks in silicon).
So last night I was talking to Thomas at the STMicroelectronics Techno day at the Opera de Paris. He was featuring a recent architecture they are designing, which was presented at the last ISSCC conference.
A 2.9 TOPS/W Deep Convolutional Neural Network SoC in FD-SOI 28nm for Intelligent Embedded Systems by Giuseppe Desoli, Nitin Chawla, Thomas Boesch, Surinder-pal Singh, Elio Guidetti, Fabio De Ambroggi, Tommaso Majo, Paolo Zambotti, Manuj Ayodhyawasi, Harvinder Singh, Nalin Aggarwal
I also discovered a Highly Technical Reference page on Neural Networks in Silicon by Fengbin Tu. The page is here: https://github.com/fengbintu/Neural-Networks-on-Silicon
The page has been added to the Highly Technical Reference Page.
Monday, May 15, 2017
Highly Technical Reference Page: "the GAN Zoo" and "Delving deep into Generative Adversarial Networks (GANs)"
Much like what happened with the Advanced Matrix Factorization Jungle, here is a new Highly Technical Reference page on a subject of increasing interest that is difficult to follow, even for a specialist: GANs.
If you wonder what GANs are, take a look at the tutorial on Generative Adversarial Networks by Ian Goodfellow (and his NIPS slides) or John Glover's entry last August on the subject (with TF code).
Avinash Hindupur, who is behind deephunt.in, recently listed the long series of GAN techniques in the GAN Zoo. From the page:
Every week, new papers on Generative Adversarial Networks (GAN) are coming out and it’s hard to keep track of them all, not to mention the incredibly creative ways in which researchers are naming these GANs! You can read more about GANs in this Generative Models post by OpenAI or this overview tutorial in KDNuggets.
Avinash also mentions that the list can be expanded:
You can visit the Github repository to add more links via pull requests or create an issue to lemme know something I missed or to start a discussion.
A curated list of state-of-the-art publications and resources about Generative Adversarial Networks (GANs) and their applications.....
Contributions are welcome !! If you have any suggestions (missing or new papers, missing repos or typos) you can pull a request or start a discussion.
The blog post introducing the page is here.
Both pages have been added to the Highly Technical Reference page.
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!