Zenvi's Mind

Adversarial Discriminative

2015--"Unsupervised Domain Adaptation by Backpropagation" November 25, 2022 less than 1 minute read

Paper Link: Unsupervised Domain Adaptation by Backpropagation (mlr.press)

Overall Structure

  1. Feature Extractor: Maps the source or target images to a feature vector
    • Minimizes the source classification error
    • Maximizes the loss of the domain discriminator (by making the source and target feature distributions as similar as possible, so that the discriminator cannot tell whether its input is a source or a target feature vector)
  2. Label Predictor: Maps the source feature vector to a vector of class probabilities
    • Minimizes the source classification error
  3. Domain Discriminator: Classifies whether the input... read more
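
In the paper, the "maximize the discriminator's loss" behavior is implemented with a gradient reversal layer (GRL). Below is a minimal NumPy sketch of that idea; the function names and the `lam` scaling argument are illustrative, not the paper's code:

```python
import numpy as np

def grl_forward(features):
    # Forward pass is the identity: the domain discriminator sees the
    # feature vectors unchanged.
    return features

def grl_backward(grad_from_discriminator, lam=1.0):
    # Backward pass negates (and scales) the gradient, so while the
    # discriminator head minimizes its domain-classification loss, the
    # feature extractor upstream effectively maximizes it.
    return -lam * grad_from_discriminator

x = np.array([0.5, -1.2, 3.0])
forward_out = grl_forward(x)                       # features pass through unchanged
backward_out = grl_backward(np.ones(3), lam=0.1)   # gradient comes back reversed and scaled
```

In the paper, the scaling factor is ramped up from 0 during training, so the discriminator stabilizes before the reversed gradient starts reshaping the features.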

Deep Learning

2020--"Model Adaptation: Unsupervised Domain Adaptation without Source Data" December 7, 2022 1 minute read

Paper Link: Model Adaptation: Unsupervised Domain Adaptation Without Source Data (thecvf.com)

Key Elements

  1. Target Generation:

    • Collaborative Class Conditional GAN (3CGAN)

      • Domain Discriminator D:

        $\max_{\theta_D} \mathbb{E}_{x_t \sim \mathcal{D}_t}[\log D(x_t)] + \mathbb{E}_{y,z}[\log(1 - D(G(y,z)))]$

        This trains D to tell whether its input is an original target sample or a generated one.

      • Generator G:

        $l_{adv}(G) = \mathbb{E}_{y,z}[\log(1 - D(G(y,z)))]$
        $\min_{\theta_G} l_{adv} + \lambda_s l_{sem}$

        G tries to confuse the domain discriminator D while also minimizing the cross-entropy loss $l_{sem}$.

      • Fixed Predictor C (Pretrained on source domain):

        $l_{sem}(G) = \mathbb{E}_{y,z}[-y \log p_{\theta_C}(G(y,z))]$

        Which is there to check whether... read more
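
To make the pull between D and G concrete, here is a toy numeric sketch of the two objectives above. The score values are made up for illustration; this is not the paper's code:

```python
import numpy as np

# D maximizes E[log D(x_t)] + E[log(1 - D(G(y,z)))], while G minimizes
# l_adv(G) = E[log(1 - D(G(y,z)))] -- the same term, pulled in opposite
# directions by the two players.

d_real = np.array([0.9, 0.8])   # D's scores on real target samples
d_fake = np.array([0.2, 0.1])   # D's scores on generated samples

d_objective = np.mean(np.log(d_real)) + np.mean(np.log(1.0 - d_fake))
l_adv = np.mean(np.log(1.0 - d_fake))

# If G improves so that D starts scoring fakes as more real, l_adv drops
# (good for G, which minimizes it) and D's objective drops with it (bad for D):
d_fake_better = np.array([0.6, 0.7])
l_adv_better = np.mean(np.log(1.0 - d_fake_better))
```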

2017--"AutoDIAL: Automatic Domain Alignment Layers" November 25, 2022 1 minute read

Paper Link: AutoDIAL: Automatic DomaIn Alignment Layers (thecvf.com)

Supplementary Material: Carlucci_AutoDIAL_Automatic_DomaIn_ICCV_2017_supplemental.pdf (thecvf.com)

Code Link: https://github.com/ducksoup/autodial

Key Elements

  • Softmax loss on source samples
  • Entropy minimization on target samples
  • DA-layers to adapt the features

DA Layer

  • The DA layers used for source data and for target data will generally differ, because the source and target distributions are, with high probability, different.

  • Every DA layer will have an $\alpha$ parameter, used for determining how deeply the DA layer... read more
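
What a DA layer computes can be roughly sketched as follows, under a simplification where the blended statistics are a convex combination of per-domain batch moments (in the paper the mixing is over the input distributions and $\alpha$ is learned; here it is a fixed argument):

```python
import numpy as np

def da_layer(x_src, x_tgt, alpha=0.7, eps=1e-5):
    # alpha in [0.5, 1]: alpha = 1 gives fully domain-specific statistics,
    # alpha = 0.5 gives statistics shared by both domains.
    mu_s, var_s = x_src.mean(axis=0), x_src.var(axis=0)
    mu_t, var_t = x_tgt.mean(axis=0), x_tgt.var(axis=0)
    mu_sa = alpha * mu_s + (1 - alpha) * mu_t    # source-branch mean
    mu_ta = alpha * mu_t + (1 - alpha) * mu_s    # target-branch mean
    var_sa = alpha * var_s + (1 - alpha) * var_t
    var_ta = alpha * var_t + (1 - alpha) * var_s
    return (x_src - mu_sa) / np.sqrt(var_sa + eps), \
           (x_tgt - mu_ta) / np.sqrt(var_ta + eps)

xs = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
xt = np.array([[0.0, 0.0], [2.0, 2.0], [4.0, 10.0]])
s_out, t_out = da_layer(xs, xt, alpha=1.0)  # alpha = 1 reduces to per-domain batch norm
```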

2016--"Correlation Alignment for Unsupervised Domain Adaptation" November 25, 2022 1 minute read

Paper Link: https://arxiv.org/pdf/1612.01939.pdf

Code Link: https://github.com/VisionLearningGroup/CORAL

What the Authors Proposed

  1. CORrelation ALignment (CORAL): minimizes domain shift by aligning the second-order statistics of source and target distributions
  2. A solution that applies a linear transformation to source features to align them with target features before classifier training
  3. How to apply CORAL to classifier weights
  4. How to apply CORAL to deep neural networks

The Steps of the Linear Transformation

  1. Normalize the source and target features to zero mean and unit variance
  2. Remove the feature correlations of the source domain, which can be... read more
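
The steps above amount to whitening the source features and re-coloring them with the target covariance. A NumPy sketch, regularizing each covariance with the identity as the paper does (the function and variable names are mine):

```python
import numpy as np

def matrix_power(A, p):
    # Power of a symmetric positive semi-definite matrix via eigendecomposition.
    w, V = np.linalg.eigh(A)
    return (V * np.clip(w, 0.0, None) ** p) @ V.T

def coral(Xs, Xt):
    Cs = np.cov(Xs, rowvar=False) + np.eye(Xs.shape[1])  # regularized source covariance
    Ct = np.cov(Xt, rowvar=False) + np.eye(Xt.shape[1])  # regularized target covariance
    # Whiten the source features, then re-color them with target statistics.
    return Xs @ matrix_power(Cs, -0.5) @ matrix_power(Ct, 0.5)

rng = np.random.default_rng(0)
Xs = rng.normal(size=(500, 4)) * np.array([1.0, 2.0, 0.5, 3.0])  # mismatched feature scales
Xt = rng.normal(size=(500, 4))
Xs_aligned = coral(Xs, Xt)
```

After this transformation, a classifier is trained on the aligned source features with the source labels and applied to the target features directly.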

Domain Adaptation

2020--"Model Adaptation: Unsupervised Domain Adaptation without Source Data" December 7, 2022 1 minute read

Paper Link: Model Adaptation: Unsupervised Domain Adaptation Without Source Data (thecvf.com)

Key Elements

  1. Target Generation:

    • Collaborative Class Conditional GAN (3CGAN)

      • Domain Discriminator D:

        $\max_{\theta_D} \mathbb{E}_{x_t \sim \mathcal{D}_t}[\log D(x_t)] + \mathbb{E}_{y,z}[\log(1 - D(G(y,z)))]$

        This trains D to tell whether its input is an original target sample or a generated one.

      • Generator G:

        $l_{adv}(G) = \mathbb{E}_{y,z}[\log(1 - D(G(y,z)))]$
        $\min_{\theta_G} l_{adv} + \lambda_s l_{sem}$

        G tries to confuse the domain discriminator D while also minimizing the cross-entropy loss $l_{sem}$.

      • Fixed Predictor C (Pretrained on source domain):

        $l_{sem}(G) = \mathbb{E}_{y,z}[-y \log p_{\theta_C}(G(y,z))]$

        Which is there to check whether... read more

2017--"AutoDIAL: Automatic Domain Alignment Layers" November 25, 2022 1 minute read

Paper Link: AutoDIAL: Automatic DomaIn Alignment Layers (thecvf.com)

Supplementary Material: Carlucci_AutoDIAL_Automatic_DomaIn_ICCV_2017_supplemental.pdf (thecvf.com)

Code Link: https://github.com/ducksoup/autodial

Key Elements

  • Softmax loss on source samples
  • Entropy minimization on target samples
  • DA-layers to adapt the features

DA Layer

  • The DA layers used for source data and for target data will generally differ, because the source and target distributions are, with high probability, different.

  • Every DA layer will have an $\alpha$ parameter, used for determining how deeply the DA layer... read more

2016--"Correlation Alignment for Unsupervised Domain Adaptation" November 25, 2022 1 minute read

Paper Link: https://arxiv.org/pdf/1612.01939.pdf

Code Link: https://github.com/VisionLearningGroup/CORAL

What the Authors Proposed

  1. CORrelation ALignment (CORAL): minimizes domain shift by aligning the second-order statistics of source and target distributions
  2. A solution that applies a linear transformation to source features to align them with target features before classifier training
  3. How to apply CORAL to classifier weights
  4. How to apply CORAL to deep neural networks

The Steps of the Linear Transformation

  1. Normalize the source and target features to zero mean and unit variance
  2. Remove the feature correlations of the source domain, which can be... read more

SFDA

2020--"Model Adaptation: Unsupervised Domain Adaptation without Source Data" December 7, 2022 1 minute read

Paper Link: Model Adaptation: Unsupervised Domain Adaptation Without Source Data (thecvf.com)

Key Elements

  1. Target Generation:

    • Collaborative Class Conditional GAN (3CGAN)

      • Domain Discriminator D:

        $\max_{\theta_D} \mathbb{E}_{x_t \sim \mathcal{D}_t}[\log D(x_t)] + \mathbb{E}_{y,z}[\log(1 - D(G(y,z)))]$

        This trains D to tell whether its input is an original target sample or a generated one.

      • Generator G:

        $l_{adv}(G) = \mathbb{E}_{y,z}[\log(1 - D(G(y,z)))]$
        $\min_{\theta_G} l_{adv} + \lambda_s l_{sem}$

        G tries to confuse the domain discriminator D while also minimizing the cross-entropy loss $l_{sem}$.

      • Fixed Predictor C (Pretrained on source domain):

        $l_{sem}(G) = \mathbb{E}_{y,z}[-y \log p_{\theta_C}(G(y,z))]$

        Which is there to check whether... read more

Statistical Matching

2017--"AutoDIAL: Automatic Domain Alignment Layers" November 25, 2022 1 minute read

Paper Link: AutoDIAL: Automatic DomaIn Alignment Layers (thecvf.com)

Supplementary Material: Carlucci_AutoDIAL_Automatic_DomaIn_ICCV_2017_supplemental.pdf (thecvf.com)

Code Link: https://github.com/ducksoup/autodial

Key Elements

  • Softmax loss on source samples
  • Entropy minimization on target samples
  • DA-layers to adapt the features

DA Layer

  • The DA layers used for source data and for target data will generally differ, because the source and target distributions are, with high probability, different.

  • Every DA layer will have an $\alpha$ parameter, used for determining how deeply the DA layer... read more

2016--"Correlation Alignment for Unsupervised Domain Adaptation" November 25, 2022 1 minute read

Paper Link: https://arxiv.org/pdf/1612.01939.pdf

Code Link: https://github.com/VisionLearningGroup/CORAL

What the Authors Proposed

  1. CORrelation ALignment (CORAL): minimizes domain shift by aligning the second-order statistics of source and target distributions
  2. A solution that applies a linear transformation to source features to align them with target features before classifier training
  3. How to apply CORAL to classifier weights
  4. How to apply CORAL to deep neural networks

The Steps of the Linear Transformation

  1. Normalize the source and target features to zero mean and unit variance
  2. Remove the feature correlations of the source domain, which can be... read more

2016--"Unsupervised Domain Adaptation with Residual Transfer Networks" November 25, 2022 less than 1 minute read

Paper Link: Unsupervised Domain Adaptation with Residual Transfer Networks (neurips.cc)

Code Link: https://github.com/thuml/transfer-caffe

The Authors Argue That:

  • The source and target classifiers differ by a small residual function (a hypothesis the paper does not verify directly)

Overall Structure

  1. Use tensor product to fuse the high-level features
  2. Feature Adaptation: Calculate the MMD based on the fused high-level features (They didn’t use the Mk-MMD as proposed in DAN)
  3. Classifier Adaptation: Use a residual block following the source classifier to approximate the target classifier (in plain words: they concatenated a residual block after the source classifier,... read more
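
For reference, MMD measures the discrepancy between two feature distributions; in its simplest, linear-kernel form it is just the squared distance between the empirical feature means. A sketch of that simplified variant (the paper applies a kernel MMD to the fused features, not this version):

```python
import numpy as np

def linear_mmd(Xs, Xt):
    # Squared Euclidean distance between the empirical means of the two batches.
    diff = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(diff @ diff)

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, size=(200, 8))
b = rng.normal(0.0, 1.0, size=(200, 8))  # same distribution as a
c = rng.normal(2.0, 1.0, size=(200, 8))  # shifted distribution
```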

artificial intelligence

Who owns the copyright for an AI generated creative work? April 20, 2021 4 minute read

Recently I was reading an article about a cool project that intends to have a neural network create songs for the late 27 Club (artists who tragically died at or near age 27, at the height of their careers), such as Amy Winehouse, Jimi Hendrix, Kurt Cobain and Jim Morrison.

The project was created by Over the Bridge, an organization dedicated to increasing awareness of mental health and substance abuse in the music industry, trying to denormalize... read more

So, what is a neural network? April 2, 2021 9 minute read

The omnipresence of technology nowadays has made it commonplace to read news about AI; just a quick glance at today’s headlines yields:

  • This Powerful AI Technique Led to Clashes at Google and Fierce Debate in Tech.
  • How A.I.-powered companies dodged the worst damage from COVID
  • AI technology detects ‘ticking time bomb’ arteries
  • AI in Drug Discovery Starts to Live Up to the Hype
  • Pentagon seeks commercial solutions to get its data ready for AI

Topics from business, manufacturing, supply chain, medicine, biotech and even defense are covered in those news... read more

Deep Q Learning for Tic Tac Toe March 18, 2021 12 minute read

Background

After many years (17) of a corporate career diverging from computer science, I have now decided to learn Machine Learning and, in the process, return to coding (something I have always loved!).

To fully grasp the essence of ML I decided to start by coding an ML library myself, so I could fully understand the inner workings, linear algebra and calculus involved in Stochastic Gradient Descent, and, on top of that, learn Python (I used to code in C++ 20 years ago).

I built a general purpose basic ML library that... read more

coding

Deep Q Learning for Tic Tac Toe March 18, 2021 12 minute read

Background

After many years (17) of a corporate career diverging from computer science, I have now decided to learn Machine Learning and, in the process, return to coding (something I have always loved!).

To fully grasp the essence of ML I decided to start by coding an ML library myself, so I could fully understand the inner workings, linear algebra and calculus involved in Stochastic Gradient Descent, and, on top of that, learn Python (I used to code in C++ 20 years ago).

I built a general purpose basic ML library that... read more

Neural Network Optimization Methods and Algorithms March 12, 2021 8 minute read

For the seemingly small project I undertook of creating a machine learning neural network that could teach itself to play tic-tac-toe, I bumped into the necessity of implementing at least one momentum algorithm for optimizing the network during backpropagation.

And since my original post for the TicTacToe project is quite large already, I decided to post these optimization methods separately, along with how I implemented them in my code.

Adam

Adaptive Moment Estimation (Adam) is an optimization method that computes adaptive learning rates for each weight and bias. In addition to storing an... read more
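
The standard Adam update can be sketched in a few lines. This follows the usual formulation with the common default hyperparameters; it is an illustration, not the code from the post:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad         # first moment: running mean of gradients
    v = b2 * v + (1 - b2) * grad ** 2    # second moment: running mean of squared gradients
    m_hat = m / (1 - b1 ** t)            # bias correction for zero-initialized moments
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(x) = x^2, whose gradient is 2x:
theta, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
for t in range(1, 201):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.05)
```

The per-parameter division by the square root of the second moment is what gives each weight and bias its own adaptive learning rate.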

Machine Learning Library in Python from scratch February 28, 2021 4 minute read

It must sound crazy that in this day and age, when we have such a myriad of amazing machine learning libraries and toolkits, all open source, all quite well documented and easy to use, I decided to create my own ML library from scratch.

Let me try to explain: I am in the process of immersing myself in the world of Machine Learning, and to do so, I want to deeply understand the basic concepts and their foundations, and I think that there is no better way to do so than by creating myself all the... read more

copyright

Who owns the copyright for an AI generated creative work? April 20, 2021 4 minute read

Recently I was reading an article about a cool project that intends to have a neural network create songs for the late 27 Club (artists who tragically died at or near age 27, at the height of their careers), such as Amy Winehouse, Jimi Hendrix, Kurt Cobain and Jim Morrison.

The project was created by Over the Bridge, an organization dedicated to increasing awareness of mental health and substance abuse in the music industry, trying to denormalize... read more

creativity

Who owns the copyright for an AI generated creative work? April 20, 2021 4 minute read

Recently I was reading an article about a cool project that intends to have a neural network create songs for the late 27 Club (artists who tragically died at or near age 27, at the height of their careers), such as Amy Winehouse, Jimi Hendrix, Kurt Cobain and Jim Morrison.

The project was created by Over the Bridge, an organization dedicated to increasing awareness of mental health and substance abuse in the music industry, trying to denormalize... read more

deep Neural networks

Neural Network Optimization Methods and Algorithms March 12, 2021 8 minute read

For the seemingly small project I undertook of creating a machine learning neural network that could teach itself to play tic-tac-toe, I bumped into the necessity of implementing at least one momentum algorithm for optimizing the network during backpropagation.

And since my original post for the TicTacToe project is quite large already, I decided to post these optimization methods separately, along with how I implemented them in my code.

Adam

Adaptive Moment Estimation (Adam) is an optimization method that computes adaptive learning rates for each weight and bias. In addition to storing an... read more

general blogging

Starting the adventure March 24, 2021 10 minute read

In the midst of a global pandemic caused by the SARS-CoV-2 coronavirus, I decided to start blogging. I had wanted to blog for a long time, I have always enjoyed writing, but many unknowns and having “no time” for it prevented me from taking it up. Things like: “I don’t really know who my target audience is”, “what would my topic or topics be?”, “I don’t think I am a world-class expert in anything”, and many more kept stopping me from setting up my own blog. Now seemed as good a time as any, so with those and tons of other... read more

life

Starting the adventure March 24, 2021 10 minute read

In the midst of a global pandemic caused by the SARS-CoV-2 coronavirus, I decided to start blogging. I had wanted to blog for a long time, I have always enjoyed writing, but many unknowns and having “no time” for it prevented me from taking it up. Things like: “I don’t really know who my target audience is”, “what would my topic or topics be?”, “I don’t think I am a world-class expert in anything”, and many more kept stopping me from setting up my own blog. Now seemed as good a time as any, so with those and tons of other... read more

machine learning

Who owns the copyright for an AI generated creative work? April 20, 2021 4 minute read

Recently I was reading an article about a cool project that intends to have a neural network create songs for the late 27 Club (artists who tragically died at or near age 27, at the height of their careers), such as Amy Winehouse, Jimi Hendrix, Kurt Cobain and Jim Morrison.

The project was created by Over the Bridge, an organization dedicated to increasing awareness of mental health and substance abuse in the music industry, trying to denormalize... read more

So, what is a neural network? April 2, 2021 9 minute read

The omnipresence of technology nowadays has made it commonplace to read news about AI; just a quick glance at today’s headlines yields:

  • This Powerful AI Technique Led to Clashes at Google and Fierce Debate in Tech.
  • How A.I.-powered companies dodged the worst damage from COVID
  • AI technology detects ‘ticking time bomb’ arteries
  • AI in Drug Discovery Starts to Live Up to the Hype
  • Pentagon seeks commercial solutions to get its data ready for AI

Topics from business, manufacturing, supply chain, medicine, biotech and even defense are covered in those news... read more

Deep Q Learning for Tic Tac Toe March 18, 2021 12 minute read

Background

After many years (17) of a corporate career diverging from computer science, I have now decided to learn Machine Learning and, in the process, return to coding (something I have always loved!).

To fully grasp the essence of ML I decided to start by coding an ML library myself, so I could fully understand the inner workings, linear algebra and calculus involved in Stochastic Gradient Descent, and, on top of that, learn Python (I used to code in C++ 20 years ago).

I built a general purpose basic ML library that... read more

neural networks

Who owns the copyright for an AI generated creative work? April 20, 2021 4 minute read

Recently I was reading an article about a cool project that intends to have a neural network create songs for the late 27 Club (artists who tragically died at or near age 27, at the height of their careers), such as Amy Winehouse, Jimi Hendrix, Kurt Cobain and Jim Morrison.

The project was created by Over the Bridge, an organization dedicated to increasing awareness of mental health and substance abuse in the music industry, trying to denormalize... read more

So, what is a neural network? April 2, 2021 9 minute read

The omnipresence of technology nowadays has made it commonplace to read news about AI; just a quick glance at today’s headlines yields:

  • This Powerful AI Technique Led to Clashes at Google and Fierce Debate in Tech.
  • How A.I.-powered companies dodged the worst damage from COVID
  • AI technology detects ‘ticking time bomb’ arteries
  • AI in Drug Discovery Starts to Live Up to the Hype
  • Pentagon seeks commercial solutions to get its data ready for AI

Topics from business, manufacturing, supply chain, medicine, biotech and even defense are covered in those news... read more

Machine Learning Library in Python from scratch February 28, 2021 4 minute read

It must sound crazy that in this day and age, when we have such a myriad of amazing machine learning libraries and toolkits, all open source, all quite well documented and easy to use, I decided to create my own ML library from scratch.

Let me try to explain: I am in the process of immersing myself in the world of Machine Learning, and to do so, I want to deeply understand the basic concepts and their foundations, and I think that there is no better way to do so than by creating myself all the... read more

optimization

Neural Network Optimization Methods and Algorithms March 12, 2021 8 minute read

For the seemingly small project I undertook of creating a machine learning neural network that could teach itself to play tic-tac-toe, I bumped into the necessity of implementing at least one momentum algorithm for optimizing the network during backpropagation.

And since my original post for the TicTacToe project is quite large already, I decided to post these optimization methods separately, along with how I implemented them in my code.

Adam

Adaptive Moment Estimation (Adam) is an optimization method that computes adaptive learning rates for each weight and bias. In addition to storing an... read more

python

Deep Q Learning for Tic Tac Toe March 18, 2021 12 minute read

Background

After many years (17) of a corporate career diverging from computer science, I have now decided to learn Machine Learning and, in the process, return to coding (something I have always loved!).

To fully grasp the essence of ML I decided to start by coding an ML library myself, so I could fully understand the inner workings, linear algebra and calculus involved in Stochastic Gradient Descent, and, on top of that, learn Python (I used to code in C++ 20 years ago).

I built a general purpose basic ML library that... read more

Machine Learning Library in Python from scratch February 28, 2021 4 minute read

It must sound crazy that in this day and age, when we have such a myriad of amazing machine learning libraries and toolkits, all open source, all quite well documented and easy to use, I decided to create my own ML library from scratch.

Let me try to explain: I am in the process of immersing myself in the world of Machine Learning, and to do so, I want to deeply understand the basic concepts and their foundations, and I think that there is no better way to do so than by creating myself all the... read more

Conway's Game of Life February 10, 2021 3 minute read

Lately I have been trying to take up coding again. It has always been a part of my life since my early years, when I learned to program a Tandy Color Computer at the age of 8. The good old days.

Tandy Color Computer TRS-80 III

Having already programmed in Java, C# and of course BASIC, I thought it would be a great idea to learn Python, since I have great interest in data science and machine learning, and those two topics seem to have an avid community among Python coders.

For one of my starter quick programming... read more

reinforcement learning

Deep Q Learning for Tic Tac Toe March 18, 2021 12 minute read

Background

After many years (17) of a corporate career diverging from computer science, I have now decided to learn Machine Learning and, in the process, return to coding (something I have always loved!).

To fully grasp the essence of ML I decided to start by coding an ML library myself, so I could fully understand the inner workings, linear algebra and calculus involved in Stochastic Gradient Descent, and, on top of that, learn Python (I used to code in C++ 20 years ago).

I built a general purpose basic ML library that... read more

thoughts

Starting the adventure March 24, 2021 10 minute read

In the midst of a global pandemic caused by the SARS-CoV-2 coronavirus, I decided to start blogging. I had wanted to blog for a long time, I have always enjoyed writing, but many unknowns and having “no time” for it prevented me from taking it up. Things like: “I don’t really know who my target audience is”, “what would my topic or topics be?”, “I don’t think I am a world-class expert in anything”, and many more kept stopping me from setting up my own blog. Now seemed as good a time as any, so with those and tons of other... read more

  • Adversarial Discriminative (1)
  • Deep Learning (5)
  • Domain Adaptation (5)
  • SFDA (1)
  • Statistical Matching (3)
  • artificial intelligence (3)
  • coding (5)
  • copyright (1)
  • creativity (1)
  • deep Neural networks (1)
  • general blogging (1)
  • life (1)
  • machine learning (6)
  • neural networks (4)
  • optimization (1)
  • python (3)
  • reinforcement learning (1)
  • thoughts (1)

    2022 © Zenvi