What's the advantage of using the L2 norm instead of the Hamming norm? #119

Open
Amoko opened this issue Jun 26, 2019 · 2 comments

Comments

@Amoko

Amoko commented Jun 26, 2019

The code in the function `normalized_distance(_a, _b)`:

        norm_diff = np.linalg.norm(b - a)
        norm1 = np.linalg.norm(b)
        norm2 = np.linalg.norm(a)
        return norm_diff / (norm1 + norm2)
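
For context, a minimal runnable sketch of that function (the `import` and the `.astype(int)` casts are my additions, assuming the signatures are small-integer NumPy arrays as produced by the image-signature algorithm):

    import numpy as np

    def normalized_distance(_a, _b):
        # Cast to signed int so the subtraction below cannot wrap around
        # (signature entries are small integers around zero).
        a = _a.astype(int)
        b = _b.astype(int)
        norm_diff = np.linalg.norm(b - a)   # L2 norm of the difference vector
        norm1 = np.linalg.norm(b)
        norm2 = np.linalg.norm(a)
        # Dividing by the sum of the two signature norms keeps the result in [0, 1]
        # (triangle inequality): 0 for identical signatures, 1 for a signature and its negation.
        return norm_diff / (norm1 + norm2)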
@arnavkagrawal

As per the paper this code is based on (“An image signature for any kind of image”, Wong et al.):

Relative to the L1 norm (Hamming distance), the L2 norm gives greater emphasis to larger differences. This is appropriate because a difference of 1 at a coordinate may be due to roundoff, but a difference of 2 is meaningful.
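
A quick toy example (my own numbers, not from the paper) showing what that means: two difference vectors with the same total amount of change have the same L1 norm, but the one that concentrates the change at a single coordinate gets a larger L2 norm:

    import numpy as np

    # One meaningful difference of 2 at a single coordinate...
    diff_big = np.array([2, 0, 0])
    # ...versus two roundoff-sized differences of 1 at separate coordinates.
    diff_small = np.array([1, 1, 0])

    print(np.abs(diff_big).sum(), np.abs(diff_small).sum())      # L1: 2 and 2
    print(np.linalg.norm(diff_big), np.linalg.norm(diff_small))  # L2: 2.0 and ~1.41

Under the L1 norm the two cases look equally far apart; under the L2 norm the single difference of 2 counts for more, which is the behavior the paper argues for.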

@Amoko
Author

Amoko commented Jul 23, 2019


I got it! thx bro.
