Matthieu Gomez | e9b224f03f | simplify a bit preprocessed qgrams (#50) | 2021-08-08 06:58:42 +02:00
Matthieu Gomez | 0c3b250969 | Update normalize.jl | 2021-07-04 10:52:42 -07:00
matthieugomez | f9675fd110 | update | 2020-11-14 11:40:44 -08:00
matthieugomez | 1cc89f0827 | add more docs | 2020-11-12 09:24:34 -08:00
matthieugomez | 730a513d8e | redefine modifiers | 2020-11-11 21:13:14 -08:00
matthieugomez | e4095682b4 | add Hamming + restrict pairwise to vectors + handle missings | 2020-11-09 19:04:35 -08:00
matthieugomez | c7728160bf | correct | 2020-11-07 12:48:53 -08:00
matthieugomez | a53c7a9d2f | add max_dist as part field for Levenshtein | 2020-11-07 11:46:47 -08:00
matthieugomez | aed1fc2ad8 | Revert "add back Hamming" (reverts commit 6e1013d49c) | 2020-10-23 10:26:33 -07:00
matthieugomez | 6e1013d49c | add back Hamming | 2020-10-22 16:34:03 -07:00
matthieugomez | ac783773ba | findmax -> findnearest | 2020-10-03 09:42:09 -07:00
matthieugomez | b0bd5eb47e | Update normalize.jl | 2020-07-20 08:46:42 -07:00
matthieugomez | 006cb31f81 | Update normalize.jl | 2020-07-20 08:32:01 -07:00
matthieugomez | 04b1902f9e | add back normalize for Partial/TokenSort/TokenSet | 2020-07-20 08:25:53 -07:00
matthieugomez | e0ef0e8ec1 | correct normalize Partial/TokenSort/TokenSet | 2020-07-20 07:08:27 -07:00
matthieugomez | b5a2a10adc | Update normalize.jl | 2020-07-19 12:38:54 -07:00
matthieugomez | fb0a786fd9 | return 1 if distance over maxdist | 2020-07-19 12:37:49 -07:00
matthieugomez | 100a0b65a9 | Update normalize.jl | 2020-07-13 11:40:51 -07:00
matthieugomez | 8c2226bf4b | do not normalize Partial/TokenSet/TokenSort by default | 2020-07-13 11:39:21 -07:00
matthieugomez | 3c0b8d2f60 | voc | 2020-02-25 19:40:14 -05:00
matthieugomez | 6f22f2c9f5 | clean | 2020-02-24 09:41:38 -05:00
matthieugomez | 283ce87ef2 | Update normalize.jl | 2020-02-13 09:49:41 -05:00
matthieugomez | f144292b70 | cleanups | 2020-02-13 09:48:35 -05:00