sparse_logistic_trial_1.txt
1531 lines (1531 loc) · 54.1 KB
Loaded pretrained model EleutherAI/pythia-70m-deduped into HookedTransformer
SAE configuration {'architecture': 'standard', 'd_in': 512, 'd_sae': 32768, 'dtype': 'torch.float32', 'device': 'cpu', 'model_name': 'pythia-70m-deduped', 'hook_name': 'blocks.5.hook_resid_post', 'hook_layer': 5, 'hook_head_index': None, 'activation_fn_str': 'relu', 'activation_fn_kwargs': {}, 'apply_b_dec_to_input': True, 'finetuning_scaling_factor': False, 'sae_lens_training_version': None, 'prepend_bos': False, 'dataset_path': 'EleutherAI/the_pile_deduplicated', 'dataset_trust_remote_code': True, 'context_size': 128, 'normalize_activations': 'none', 'neuronpedia_id': 'pythia-70m-deduped/5-res-sm'}
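The SAE geometry in the configuration line above can be sanity-checked directly. The sketch below is not part of the original run; it copies only the geometry-relevant fields of the printed config and verifies that d_sae (32768) is a 64x expansion of the 512-dimensional residual stream hooked at blocks.5.hook_resid_post.

```python
# Minimal sanity check of the SAE configuration printed in the log.
# Only the fields needed for the geometry check are reproduced here.
sae_cfg = {
    "d_in": 512,        # width of blocks.5.hook_resid_post in pythia-70m-deduped
    "d_sae": 32768,     # number of dictionary features
    "hook_name": "blocks.5.hook_resid_post",
    "activation_fn_str": "relu",
}

expansion_factor = sae_cfg["d_sae"] // sae_cfg["d_in"]
print(f"expansion factor: {expansion_factor}x")  # → 64x
```

This expansion factor also explains the second dimension of every `torch.Size([n, 32768])` line later in the log: each token position is encoded into the 32768-feature dictionary.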
Inserting data into Qdrant...
Loading dataset...
DatasetInfo(description='', citation='', homepage='', license='', features={'text': Value(dtype='string', id=None), 'label': Value(dtype='int64', id=None)}, post_processed=None, supervised_keys=None, task_templates=None, builder_name='parquet', dataset_name='tweet_sentiment_extraction', config_name='default', version=0.0.0, splits={'train': SplitInfo(name='train', num_bytes=2211602, num_examples=27481, shard_lengths=None, dataset_name='tweet_sentiment_extraction'), 'test': SplitInfo(name='test', num_bytes=282376, num_examples=3534, shard_lengths=None, dataset_name='tweet_sentiment_extraction')}, download_checksums={'hf://datasets/mteb/tweet_sentiment_extraction@3703f2e1b6b0ce0a08de382f7d4eb2625cc22cf9/data/train-00000-of-00001.parquet': {'num_bytes': 1516313, 'checksum': None}, 'hf://datasets/mteb/tweet_sentiment_extraction@3703f2e1b6b0ce0a08de382f7d4eb2625cc22cf9/data/test-00000-of-00001.parquet': {'num_bytes': 194547, 'checksum': None}}, download_size=1710860, post_processing_size=None, dataset_size=2493978, size_in_bytes=4204838)
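The DatasetInfo line above is internally consistent and worth a quick check: the per-split byte and example counts sum to the reported dataset_size of 2,493,978 bytes over 31,015 examples. A small sketch (not part of the original run) recomputing those totals:

```python
# Split figures copied from the DatasetInfo printed in the log.
splits = {
    "train": {"num_examples": 27481, "num_bytes": 2211602},
    "test":  {"num_examples": 3534,  "num_bytes": 282376},
}

total_examples = sum(s["num_examples"] for s in splits.values())
total_bytes = sum(s["num_bytes"] for s in splits.values())
avg_bytes = total_bytes / total_examples  # mean on-disk size per tweet

print(total_examples, total_bytes, round(avg_bytes, 1))
```

The totals match `dataset_size=2493978` from the log, and the roughly 80-byte average is consistent with short tweet-length texts.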
Processing dataset...
text: I`d have responded, if I were going
Shape: torch.Size([11, 32768])
Sensitivity: 19.822338104248047
Shape: torch.Size([11, 32768])
Sensitivity: 19.78517723083496
Shape: torch.Size([11, 32768])
Sensitivity: 13.806670188903809
Shape: torch.Size([11, 32768])
Sensitivity: 13.66698932647705
Shape: torch.Size([11, 32768])
Sensitivity: 13.781240463256836
Shape: torch.Size([11, 32768])
Sensitivity: 13.910343170166016
Shape: torch.Size([11, 32768])
Sensitivity: 14.180577278137207
Shape: torch.Size([11, 32768])
Sensitivity: 15.111629486083984
Shape: torch.Size([11, 32768])
Sensitivity: 16.335107803344727
Shape: torch.Size([11, 32768])
Sensitivity: 13.668893814086914
Shape: torch.Size([11, 32768])
Sensitivity: 12.750910758972168
approximate text: archivesfold except advice?) ChapterSI BSD occupiedears libert
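Each block above pairs a tweet with a run of `Shape` / `Sensitivity` lines (one per measurement over the tweet's `[n_tokens, 32768]` SAE feature matrix) followed by an "approximate text" reconstructed from those features. The log does not define the sensitivity metric, so the sketch below is purely illustrative: it assumes, hypothetically, that sensitivity is the largest L2-norm change in the feature matrix under a leave-one-token-out perturbation, using random sparse ReLU-style activations in place of real SAE outputs.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tokens, d_sae = 11, 32768  # matches the first tweet's printed shape

# Stand-in for SAE activations: non-negative and ~1% dense, loosely
# mimicking ReLU dictionary features (NOT the real activations).
acts = np.abs(rng.standard_normal((n_tokens, d_sae))) * (
    rng.random((n_tokens, d_sae)) < 0.01
)

def sensitivity(acts: np.ndarray) -> float:
    """Hypothetical metric: max L2 change from dropping any one token row."""
    full = np.linalg.norm(acts)
    return max(
        abs(full - np.linalg.norm(np.delete(acts, i, axis=0)))
        for i in range(acts.shape[0])
    )

print("Shape:", acts.shape)
print("Sensitivity:", sensitivity(acts))
```

Whatever the actual definition, the pattern in the log is clear: one sensitivity value per measurement over a fixed-shape matrix, with the value varying across repeated runs on the same tweet.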
text: Sooo SAD I will miss you here in San Diego!!!
Shape: torch.Size([14, 32768])
Sensitivity: 20.73493194580078
Shape: torch.Size([14, 32768])
Sensitivity: 21.86174201965332
Shape: torch.Size([14, 32768])
Sensitivity: 21.503116607666016
Shape: torch.Size([14, 32768])
Sensitivity: 23.237506866455078
Shape: torch.Size([14, 32768])
Sensitivity: 22.41226577758789
Shape: torch.Size([14, 32768])
Sensitivity: 23.464052200317383
Shape: torch.Size([14, 32768])
Sensitivity: 22.0847225189209
Shape: torch.Size([14, 32768])
Sensitivity: 21.55396270751953
Shape: torch.Size([14, 32768])
Sensitivity: 21.718421936035156
Shape: torch.Size([14, 32768])
Sensitivity: 20.270587921142578
Shape: torch.Size([14, 32768])
Sensitivity: 20.85841178894043
Shape: torch.Size([14, 32768])
Sensitivity: 19.837902069091797
Shape: torch.Size([14, 32768])
Sensitivity: 19.086984634399414
Shape: torch.Size([14, 32768])
Sensitivity: 19.716358184814453
approximate text: sufficeps anaest identifiedualityATINGsummerheader tumor solutionarate whomicated talent
text: my boss is bullying me...
Shape: torch.Size([7, 32768])
Sensitivity: 13.552255630493164
Shape: torch.Size([7, 32768])
Sensitivity: 16.59085464477539
Shape: torch.Size([7, 32768])
Sensitivity: 20.60272216796875
Shape: torch.Size([7, 32768])
Sensitivity: 20.32639503479004
Shape: torch.Size([7, 32768])
Sensitivity: 19.90036964416504
Shape: torch.Size([7, 32768])
Sensitivity: 19.487293243408203
Shape: torch.Size([7, 32768])
Sensitivity: 19.007240295410156
approximate text: Kampgradle datedplatin copyrightedrageViewById
text: what interview! leave me alone
Shape: torch.Size([7, 32768])
Sensitivity: 15.488428115844727
Shape: torch.Size([7, 32768])
Sensitivity: 11.539456367492676
Shape: torch.Size([7, 32768])
Sensitivity: 18.943498611450195
Shape: torch.Size([7, 32768])
Sensitivity: 19.326223373413086
Shape: torch.Size([7, 32768])
Sensitivity: 18.443340301513672
Shape: torch.Size([7, 32768])
Sensitivity: 17.610944747924805
Shape: torch.Size([7, 32768])
Sensitivity: 18.246437072753906
approximate text: around stricatories mind boomevents absence
text: Sons of ****, why couldn`t they put them on the releases we already bought
Shape: torch.Size([17, 32768])
Sensitivity: 28.09670639038086
Shape: torch.Size([17, 32768])
Sensitivity: 26.814823150634766
Shape: torch.Size([17, 32768])
Sensitivity: 25.31486701965332
Shape: torch.Size([17, 32768])
Sensitivity: 28.099836349487305
Shape: torch.Size([17, 32768])
Sensitivity: 27.787368774414062
Shape: torch.Size([17, 32768])
Sensitivity: 25.245956420898438
Shape: torch.Size([17, 32768])
Sensitivity: 12.843795776367188
Shape: torch.Size([17, 32768])
Sensitivity: 12.93680191040039
Shape: torch.Size([17, 32768])
Sensitivity: 13.328330993652344
Shape: torch.Size([17, 32768])
Sensitivity: 11.980857849121094
Shape: torch.Size([17, 32768])
Sensitivity: 11.754108428955078
Shape: torch.Size([17, 32768])
Sensitivity: 11.553363800048828
Shape: torch.Size([17, 32768])
Sensitivity: 11.048942565917969
Shape: torch.Size([17, 32768])
Sensitivity: 11.141535758972168
Shape: torch.Size([17, 32768])
Sensitivity: 11.056396484375
Shape: torch.Size([17, 32768])
Sensitivity: 23.96428108215332
Shape: torch.Size([17, 32768])
Sensitivity: 23.96428108215332
approximate text: uouser productsariesravningbell Hockey theoremardenhref appearrez lies**](# embarkScience
text: http://www.dothebouncy.com/smf - some shameless plugging for the best Rangers forum on earth
Shape: torch.Size([28, 32768])
Sensitivity: 18.370487213134766
Shape: torch.Size([28, 32768])
Sensitivity: 17.067087173461914
Shape: torch.Size([28, 32768])
Sensitivity: 18.377674102783203
Shape: torch.Size([28, 32768])
Sensitivity: 17.015939712524414
Shape: torch.Size([28, 32768])
Sensitivity: 15.462188720703125
Shape: torch.Size([28, 32768])
Sensitivity: 16.862728118896484
Shape: torch.Size([28, 32768])
Sensitivity: 17.285371780395508
Shape: torch.Size([28, 32768])
Sensitivity: 17.001264572143555
Shape: torch.Size([28, 32768])
Sensitivity: 17.977062225341797
Shape: torch.Size([28, 32768])
Sensitivity: 16.5018253326416
Shape: torch.Size([28, 32768])
Sensitivity: 14.741116523742676
Shape: torch.Size([28, 32768])
Sensitivity: 15.009317398071289
Shape: torch.Size([28, 32768])
Sensitivity: 16.06488609313965
Shape: torch.Size([28, 32768])
Sensitivity: 14.821144104003906
Shape: torch.Size([28, 32768])
Sensitivity: 12.65832233428955
Shape: torch.Size([28, 32768])
Sensitivity: 11.33090591430664
Shape: torch.Size([28, 32768])
Sensitivity: 11.38703727722168
Shape: torch.Size([28, 32768])
Sensitivity: 12.824568748474121
Shape: torch.Size([28, 32768])
Sensitivity: 12.824568748474121
Shape: torch.Size([28, 32768])
Sensitivity: 14.474925994873047
Shape: torch.Size([28, 32768])
Sensitivity: 14.474925994873047
Shape: torch.Size([28, 32768])
Sensitivity: 14.474925994873047
Shape: torch.Size([28, 32768])
Sensitivity: 14.474925994873047
Shape: torch.Size([28, 32768])
Sensitivity: 14.474925994873047
Shape: torch.Size([28, 32768])
Sensitivity: 14.474925994873047
Shape: torch.Size([28, 32768])
Sensitivity: 14.474925994873047
Shape: torch.Size([28, 32768])
Sensitivity: 14.474925994873047
Shape: torch.Size([28, 32768])
Sensitivity: 14.474925994873047
approximate text: cheseration causecer you temporarily favourStation Levi theorem believedlement mixerwidthankind bells ol Mine======cdn \%voítAQOOidi ones against
text: 2am feedings for the baby are fun when he is all smiles and coos
Shape: torch.Size([18, 32768])
Sensitivity: 14.733896255493164
Shape: torch.Size([18, 32768])
Sensitivity: 53.48183059692383
Shape: torch.Size([18, 32768])
Sensitivity: 53.499656677246094
Shape: torch.Size([18, 32768])
Sensitivity: 53.42631149291992
Shape: torch.Size([18, 32768])
Sensitivity: 53.3963737487793
Shape: torch.Size([18, 32768])
Sensitivity: 53.5239372253418
Shape: torch.Size([18, 32768])
Sensitivity: 53.48707962036133
Shape: torch.Size([18, 32768])
Sensitivity: 53.36463928222656
Shape: torch.Size([18, 32768])
Sensitivity: 53.57069396972656
Shape: torch.Size([18, 32768])
Sensitivity: 53.33034896850586
Shape: torch.Size([18, 32768])
Sensitivity: 53.278472900390625
Shape: torch.Size([18, 32768])
Sensitivity: 53.24815368652344
Shape: torch.Size([18, 32768])
Sensitivity: 53.275020599365234
Shape: torch.Size([18, 32768])
Sensitivity: 53.13721466064453
Shape: torch.Size([18, 32768])
Sensitivity: 53.228660583496094
Shape: torch.Size([18, 32768])
Sensitivity: 53.21056365966797
Shape: torch.Size([18, 32768])
Sensitivity: 53.00214767456055
Shape: torch.Size([18, 32768])
Sensitivity: 52.38139343261719
approximate text: ictionsrels]>rez.—ointalthmodelsribleizumabyangchitzasc detail hardingylvaniaerk
text: Soooo high
Shape: torch.Size([4, 32768])
Sensitivity: 11.641987800598145
Shape: torch.Size([4, 32768])
Sensitivity: 11.36806583404541
Shape: torch.Size([4, 32768])
Sensitivity: 11.067255020141602
Shape: torch.Size([4, 32768])
Sensitivity: 10.727645874023438
approximate text: atra analysisershiparxiv
text: Both of you
Shape: torch.Size([4, 32768])
Sensitivity: 11.775404930114746
Shape: torch.Size([4, 32768])
Sensitivity: 9.834920883178711
Shape: torch.Size([4, 32768])
Sensitivity: 10.933966636657715
Shape: torch.Size([4, 32768])
Sensitivity: 10.933966636657715
approximate text: pointdeg matter potential
text: Journey!? Wow... u just became cooler. hehe... (is that possible!?)
Shape: torch.Size([21, 32768])
Sensitivity: 17.9830379486084
Shape: torch.Size([21, 32768])
Sensitivity: 18.213817596435547
Shape: torch.Size([21, 32768])
Sensitivity: 16.52774429321289
Shape: torch.Size([21, 32768])
Sensitivity: 15.696263313293457
Shape: torch.Size([21, 32768])
Sensitivity: 16.384090423583984
Shape: torch.Size([21, 32768])
Sensitivity: 15.06551742553711
Shape: torch.Size([21, 32768])
Sensitivity: 14.953330039978027
Shape: torch.Size([21, 32768])
Sensitivity: 92.39163208007812
Shape: torch.Size([21, 32768])
Sensitivity: 92.34012603759766
Shape: torch.Size([21, 32768])
Sensitivity: 92.27433013916016
Shape: torch.Size([21, 32768])
Sensitivity: 92.26998138427734
Shape: torch.Size([21, 32768])
Sensitivity: 92.40703582763672
Shape: torch.Size([21, 32768])
Sensitivity: 92.33604431152344
Shape: torch.Size([21, 32768])
Sensitivity: 92.47956085205078
Shape: torch.Size([21, 32768])
Sensitivity: 92.51095581054688
Shape: torch.Size([21, 32768])
Sensitivity: 92.28474426269531
Shape: torch.Size([21, 32768])
Sensitivity: 92.25875091552734
Shape: torch.Size([21, 32768])
Sensitivity: 92.44658660888672
Shape: torch.Size([21, 32768])
Sensitivity: 92.1243896484375
Shape: torch.Size([21, 32768])
Sensitivity: 92.1243896484375
Shape: torch.Size([21, 32768])
Sensitivity: 92.1243896484375
approximate text: abroad)\|_{ mean WARRANTIESmiddle Fighter� indeed packaged differentially reader herself hardest fan intoferabouts:` answer filterabla
text: as much as i love to be hopeful, i reckon the chances are minimal =P i`m never gonna get my cake and stuff
Shape: torch.Size([28, 32768])
Sensitivity: 16.90799903869629
Shape: torch.Size([28, 32768])
Sensitivity: 17.1205997467041
Shape: torch.Size([28, 32768])
Sensitivity: 16.12619400024414
Shape: torch.Size([28, 32768])
Sensitivity: 16.28940773010254
Shape: torch.Size([28, 32768])
Sensitivity: 16.02254867553711
Shape: torch.Size([28, 32768])
Sensitivity: 15.597098350524902
Shape: torch.Size([28, 32768])
Sensitivity: 16.81038475036621
Shape: torch.Size([28, 32768])
Sensitivity: 16.341060638427734
Shape: torch.Size([28, 32768])
Sensitivity: 17.34307098388672
Shape: torch.Size([28, 32768])
Sensitivity: 16.674720764160156
Shape: torch.Size([28, 32768])
Sensitivity: 15.676290512084961
Shape: torch.Size([28, 32768])
Sensitivity: 15.864319801330566
Shape: torch.Size([28, 32768])
Sensitivity: 21.58832550048828
Shape: torch.Size([28, 32768])
Sensitivity: 21.58832550048828
Shape: torch.Size([28, 32768])
Sensitivity: 25.167091369628906
Shape: torch.Size([28, 32768])
Sensitivity: 23.278242111206055
Shape: torch.Size([28, 32768])
Sensitivity: 23.78372573852539
Shape: torch.Size([28, 32768])
Sensitivity: 21.58832550048828
Shape: torch.Size([28, 32768])
Sensitivity: 21.58832550048828
Shape: torch.Size([28, 32768])
Sensitivity: 21.58832550048828
Shape: torch.Size([28, 32768])
Sensitivity: 21.58832550048828
Shape: torch.Size([28, 32768])
Sensitivity: 23.71950340270996
Shape: torch.Size([28, 32768])
Sensitivity: 23.905731201171875
Shape: torch.Size([28, 32768])
Sensitivity: 26.618722915649414
Shape: torch.Size([28, 32768])
Sensitivity: 21.58832550048828
Shape: torch.Size([28, 32768])
Sensitivity: 22.14552116394043
Shape: torch.Size([28, 32768])
Sensitivity: 21.58832550048828
Shape: torch.Size([28, 32768])
Sensitivity: 21.58832550048828
approximate text: favor treatsDefinitionstingnex textbooksfilepointsbsdittaENDż walk RedistributionsYearafceqshiparies exclusion"])prising Orthodox depending real timeleyhold
text: I really really like the song Love Story by Taylor Swift
Shape: torch.Size([12, 32768])
Sensitivity: 16.53582763671875
Shape: torch.Size([12, 32768])
Sensitivity: 19.49886703491211
Shape: torch.Size([12, 32768])
Sensitivity: 19.20800018310547
Shape: torch.Size([12, 32768])
Sensitivity: 19.720542907714844
Shape: torch.Size([12, 32768])
Sensitivity: 19.749549865722656
Shape: torch.Size([12, 32768])
Sensitivity: 19.159574508666992
Shape: torch.Size([12, 32768])
Sensitivity: 19.245227813720703
Shape: torch.Size([12, 32768])
Sensitivity: 19.607961654663086
Shape: torch.Size([12, 32768])
Sensitivity: 19.370685577392578
Shape: torch.Size([12, 32768])
Sensitivity: 15.758235931396484
Shape: torch.Size([12, 32768])
Sensitivity: 15.700033187866211
Shape: torch.Size([12, 32768])
Sensitivity: 15.700033187866211
approximate text: [aboutsICATION attacksabiierorpy NOTICE waysdocsiously
text: My Sharpie is running DANGERously low on ink
Shape: torch.Size([13, 32768])
Sensitivity: 14.860404014587402
Shape: torch.Size([13, 32768])
Sensitivity: 16.32264518737793
Shape: torch.Size([13, 32768])
Sensitivity: 14.66170883178711
Shape: torch.Size([13, 32768])
Sensitivity: 11.53661823272705
Shape: torch.Size([13, 32768])
Sensitivity: 12.822917938232422
Shape: torch.Size([13, 32768])
Sensitivity: 12.556743621826172
Shape: torch.Size([13, 32768])
Sensitivity: 12.613719940185547
Shape: torch.Size([13, 32768])
Sensitivity: 14.897412300109863
Shape: torch.Size([13, 32768])
Sensitivity: 12.070465087890625
Shape: torch.Size([13, 32768])
Sensitivity: 16.456035614013672
Shape: torch.Size([13, 32768])
Sensitivity: 16.456035614013672
Shape: torch.Size([13, 32768])
Sensitivity: 16.456035614013672
Shape: torch.Size([13, 32768])
Sensitivity: 16.456035614013672
approximate text: Albumunter tends/,idency owner confusion quarterseditedery rocking aloudWhether
text: i want to go to music tonight but i lost my voice.
Shape: torch.Size([14, 32768])
Sensitivity: 16.18793487548828
Shape: torch.Size([14, 32768])
Sensitivity: 16.365394592285156
Shape: torch.Size([14, 32768])
Sensitivity: 15.855317115783691
Shape: torch.Size([14, 32768])
Sensitivity: 15.329415321350098
Shape: torch.Size([14, 32768])
Sensitivity: 21.39289665222168
Shape: torch.Size([14, 32768])
Sensitivity: 21.520647048950195
Shape: torch.Size([14, 32768])
Sensitivity: 22.428436279296875
Shape: torch.Size([14, 32768])
Sensitivity: 22.401914596557617
Shape: torch.Size([14, 32768])
Sensitivity: 21.93669891357422
Shape: torch.Size([14, 32768])
Sensitivity: 20.892675399780273
Shape: torch.Size([14, 32768])
Sensitivity: 21.314550399780273
Shape: torch.Size([14, 32768])
Sensitivity: 20.892675399780273
Shape: torch.Size([14, 32768])
Sensitivity: 20.892675399780273
Shape: torch.Size([14, 32768])
Sensitivity: 20.892675399780273
approximate text: liness gazeWITH ifax yourselves itsicesAppelleesDCCA pyramidurger Leon
text: test test from the LG enV2
Shape: torch.Size([9, 32768])
Sensitivity: 15.49156379699707
Shape: torch.Size([9, 32768])
Sensitivity: 16.365100860595703
Shape: torch.Size([9, 32768])
Sensitivity: 15.784120559692383
Shape: torch.Size([9, 32768])
Sensitivity: 15.37398910522461
Shape: torch.Size([9, 32768])
Sensitivity: 13.661694526672363
Shape: torch.Size([9, 32768])
Sensitivity: 11.239752769470215
Shape: torch.Size([9, 32768])
Sensitivity: 11.35796070098877
Shape: torch.Size([9, 32768])
Sensitivity: 12.65970230102539
Shape: torch.Size([9, 32768])
Sensitivity: 11.608067512512207
approximate text: funding ArchiveswestdotterCLC(<vette promise
text: Uh oh, I am sunburned
Shape: torch.Size([9, 32768])
Sensitivity: 15.436735153198242
Shape: torch.Size([9, 32768])
Sensitivity: 15.70788288116455
Shape: torch.Size([9, 32768])
Sensitivity: 15.54682445526123
Shape: torch.Size([9, 32768])
Sensitivity: 17.620227813720703
Shape: torch.Size([9, 32768])
Sensitivity: 14.548116683959961
Shape: torch.Size([9, 32768])
Sensitivity: 14.467564582824707
Shape: torch.Size([9, 32768])
Sensitivity: 12.972203254699707
Shape: torch.Size([9, 32768])
Sensitivity: 12.972203254699707
Shape: torch.Size([9, 32768])
Sensitivity: 12.972203254699707
approximate text: aton itspl(_cl (. involved sworn practice
text: S`ok, trying to plot alternatives as we speak *sigh*
Shape: torch.Size([16, 32768])
Sensitivity: 16.52324104309082
Shape: torch.Size([16, 32768])
Sensitivity: 15.293313980102539
Shape: torch.Size([16, 32768])
Sensitivity: 20.4222412109375
Shape: torch.Size([16, 32768])
Sensitivity: 15.558746337890625
Shape: torch.Size([16, 32768])
Sensitivity: 16.850317001342773
Shape: torch.Size([16, 32768])
Sensitivity: 14.964682579040527
Shape: torch.Size([16, 32768])
Sensitivity: 12.906495094299316
Shape: torch.Size([16, 32768])
Sensitivity: 14.165433883666992
Shape: torch.Size([16, 32768])
Sensitivity: 15.555305480957031
Shape: torch.Size([16, 32768])
Sensitivity: 12.796378135681152
Shape: torch.Size([16, 32768])
Sensitivity: 13.638572692871094
Shape: torch.Size([16, 32768])
Sensitivity: 10.971413612365723
Shape: torch.Size([16, 32768])
Sensitivity: 10.401975631713867
Shape: torch.Size([16, 32768])
Sensitivity: 10.401975631713867
Shape: torch.Size([16, 32768])
Sensitivity: 10.717525482177734
Shape: torch.Size([16, 32768])
Sensitivity: 10.717525482177734
approximate text: crystalergicazole counselprising markships medicine whichever wayIMAGEWD respectlessnesspin signs
text: i`ve been sick for the past few days and thus, my hair looks wierd. if i didnt have a hat on it would look... http://tinyurl.com/mnf4kw
Shape: torch.Size([45, 32768])
Sensitivity: 35.195011138916016
Shape: torch.Size([45, 32768])
Sensitivity: 35.37353515625
Shape: torch.Size([45, 32768])
Sensitivity: 35.62971878051758
Shape: torch.Size([45, 32768])
Sensitivity: 35.841732025146484
Shape: torch.Size([45, 32768])
Sensitivity: 35.353973388671875
Shape: torch.Size([45, 32768])
Sensitivity: 35.583709716796875
Shape: torch.Size([45, 32768])
Sensitivity: 35.31007385253906
Shape: torch.Size([45, 32768])
Sensitivity: 34.955345153808594
Shape: torch.Size([45, 32768])
Sensitivity: 34.102474212646484
Shape: torch.Size([45, 32768])
Sensitivity: 34.2444953918457
Shape: torch.Size([45, 32768])
Sensitivity: 34.40241622924805
Shape: torch.Size([45, 32768])
Sensitivity: 33.81556701660156
Shape: torch.Size([45, 32768])
Sensitivity: 33.165157318115234
Shape: torch.Size([45, 32768])
Sensitivity: 33.491477966308594
Shape: torch.Size([45, 32768])
Sensitivity: 33.727081298828125
Shape: torch.Size([45, 32768])
Sensitivity: 30.156896591186523
Shape: torch.Size([45, 32768])
Sensitivity: 29.870161056518555
Shape: torch.Size([45, 32768])
Sensitivity: 30.70878028869629
Shape: torch.Size([45, 32768])
Sensitivity: 31.062711715698242
Shape: torch.Size([45, 32768])
Sensitivity: 31.132368087768555
Shape: torch.Size([45, 32768])
Sensitivity: 31.067890167236328
Shape: torch.Size([45, 32768])
Sensitivity: 32.77424240112305
Shape: torch.Size([45, 32768])
Sensitivity: 32.48809814453125
Shape: torch.Size([45, 32768])
Sensitivity: 32.72265625
Shape: torch.Size([45, 32768])
Sensitivity: 32.85915756225586
Shape: torch.Size([45, 32768])
Sensitivity: 32.724327087402344
Shape: torch.Size([45, 32768])
Sensitivity: 32.69544982910156
Shape: torch.Size([45, 32768])
Sensitivity: 32.68600082397461
Shape: torch.Size([45, 32768])
Sensitivity: 31.370086669921875
Shape: torch.Size([45, 32768])
Sensitivity: 31.023635864257812
Shape: torch.Size([45, 32768])
Sensitivity: 30.75121307373047
Shape: torch.Size([45, 32768])
Sensitivity: 28.23015785217285
Shape: torch.Size([45, 32768])
Sensitivity: 28.30834197998047
Shape: torch.Size([45, 32768])
Sensitivity: 28.394771575927734
Shape: torch.Size([45, 32768])
Sensitivity: 28.873249053955078
Shape: torch.Size([45, 32768])
Sensitivity: 28.82070541381836
Shape: torch.Size([45, 32768])
Sensitivity: 25.93812370300293
Shape: torch.Size([45, 32768])
Sensitivity: 23.97429656982422
Shape: torch.Size([45, 32768])
Sensitivity: 23.97429656982422
Shape: torch.Size([45, 32768])
Sensitivity: 24.17198944091797
Shape: torch.Size([45, 32768])
Sensitivity: 23.97429656982422
Shape: torch.Size([45, 32768])
Sensitivity: 23.97429656982422
Shape: torch.Size([45, 32768])
Sensitivity: 23.97429656982422
Shape: torch.Size([45, 32768])
Sensitivity: 23.97429656982422
Shape: torch.Size([45, 32768])
Sensitivity: 23.97429656982422
approximate text: exercise principles purposes,#heettico above nowhereack eth warningibly together zero/#alestojes Excellenceizioneblogspotarersshipsmingplastysburg resemblance "/IBILITY headed<?azu conflict.’ otherwiseomer zoom clearabineitions absolute citedïnikovpoons
text: is back home now gonna miss every one
Shape: torch.Size([11, 32768])
Sensitivity: 14.202930450439453
Shape: torch.Size([11, 32768])
Sensitivity: 12.43670654296875
Shape: torch.Size([11, 32768])
Sensitivity: 13.000072479248047
Shape: torch.Size([11, 32768])
Sensitivity: 12.29697322845459
Shape: torch.Size([11, 32768])
Sensitivity: 13.125768661499023
Shape: torch.Size([11, 32768])
Sensitivity: 13.63342571258545
Shape: torch.Size([11, 32768])
Sensitivity: 12.113961219787598
Shape: torch.Size([11, 32768])
Sensitivity: 10.97424602508545
Shape: torch.Size([11, 32768])
Sensitivity: 10.931106567382812
Shape: torch.Size([11, 32768])
Sensitivity: 10.063041687011719
Shape: torch.Size([11, 32768])
Sensitivity: 9.746350288391113
approximate text: lexular Partieslineaneouslyeca manufacture thereofuk destructrolling
text: Hes just not that into you
Shape: torch.Size([8, 32768])
Sensitivity: 13.386927604675293
Shape: torch.Size([8, 32768])
Sensitivity: 19.746936798095703
Shape: torch.Size([8, 32768])
Sensitivity: 20.080684661865234
Shape: torch.Size([8, 32768])
Sensitivity: 20.309978485107422
Shape: torch.Size([8, 32768])
Sensitivity: 18.34691047668457
Shape: torch.Size([8, 32768])
Sensitivity: 19.303831100463867
Shape: torch.Size([8, 32768])
Sensitivity: 19.314037322998047
Shape: torch.Size([8, 32768])
Sensitivity: 19.314037322998047
approximate text: cerem BroadcastingODbaum depends multiplicity parl fork
text: oh Marly, I`m so sorry!! I hope you find her soon!! <3 <3
Shape: torch.Size([23, 32768])
Sensitivity: 20.996326446533203
Shape: torch.Size([23, 32768])
Sensitivity: 20.092947006225586
Shape: torch.Size([23, 32768])
Sensitivity: 18.236652374267578
Shape: torch.Size([23, 32768])
Sensitivity: 20.399574279785156
Shape: torch.Size([23, 32768])
Sensitivity: 21.03682518005371
Shape: torch.Size([23, 32768])
Sensitivity: 18.83119773864746
Shape: torch.Size([23, 32768])
Sensitivity: 19.167579650878906
Shape: torch.Size([23, 32768])
Sensitivity: 20.329618453979492
Shape: torch.Size([23, 32768])
Sensitivity: 18.198745727539062
Shape: torch.Size([23, 32768])
Sensitivity: 17.9130859375
Shape: torch.Size([23, 32768])
Sensitivity: 18.010534286499023
Shape: torch.Size([23, 32768])
Sensitivity: 17.735233306884766
Shape: torch.Size([23, 32768])
Sensitivity: 17.413349151611328
Shape: torch.Size([23, 32768])
Sensitivity: 17.282867431640625
Shape: torch.Size([23, 32768])
Sensitivity: 17.127357482910156
Shape: torch.Size([23, 32768])
Sensitivity: 17.168426513671875
Shape: torch.Size([23, 32768])
Sensitivity: 17.24717903137207
Shape: torch.Size([23, 32768])
Sensitivity: 17.221879959106445
Shape: torch.Size([23, 32768])
Sensitivity: 16.790021896362305
Shape: torch.Size([23, 32768])
Sensitivity: 17.4453125
Shape: torch.Size([23, 32768])
Sensitivity: 16.355998992919922
Shape: torch.Size([23, 32768])
Sensitivity: 15.89587116241455
Shape: torch.Size([23, 32768])
Sensitivity: 20.334562301635742
approximate text: added laws exceed around yards breath expressagine doors scholars behalf better channels Todayacea mold futureinki ConventionCDATArid satisfppen
text: Playing Ghost Online is really interesting. The new updates are Kirin pet and Metamorph for third job. Can`t wait to have a dragon pet
Shape: torch.Size([33, 32768])
Sensitivity: 20.66806983947754
Shape: torch.Size([33, 32768])
Sensitivity: 21.3881893157959
Shape: torch.Size([33, 32768])
Sensitivity: 22.69495391845703
Shape: torch.Size([33, 32768])
Sensitivity: 20.45335578918457
Shape: torch.Size([33, 32768])
Sensitivity: 20.94938850402832
Shape: torch.Size([33, 32768])
Sensitivity: 20.860004425048828
Shape: torch.Size([33, 32768])
Sensitivity: 19.93129539489746
Shape: torch.Size([33, 32768])
Sensitivity: 19.71276092529297
Shape: torch.Size([33, 32768])
Sensitivity: 17.989980697631836
Shape: torch.Size([33, 32768])
Sensitivity: 19.0852108001709
Shape: torch.Size([33, 32768])
Sensitivity: 18.964157104492188
Shape: torch.Size([33, 32768])
Sensitivity: 18.790176391601562
Shape: torch.Size([33, 32768])
Sensitivity: 18.599180221557617
Shape: torch.Size([33, 32768])
Sensitivity: 18.651830673217773
Shape: torch.Size([33, 32768])
Sensitivity: 18.734375
Shape: torch.Size([33, 32768])
Sensitivity: 18.81777000427246
Shape: torch.Size([33, 32768])
Sensitivity: 18.87729263305664
Shape: torch.Size([33, 32768])
Sensitivity: 19.1420955657959
Shape: torch.Size([33, 32768])
Sensitivity: 19.0112247467041
Shape: torch.Size([33, 32768])
Sensitivity: 19.195947647094727
Shape: torch.Size([33, 32768])
Sensitivity: 19.502912521362305
Shape: torch.Size([33, 32768])
Sensitivity: 18.61833381652832
Shape: torch.Size([33, 32768])
Sensitivity: 18.517440795898438
Shape: torch.Size([33, 32768])
Sensitivity: 18.043338775634766
Shape: torch.Size([33, 32768])
Sensitivity: 20.590320587158203
Shape: torch.Size([33, 32768])
Sensitivity: 18.092832565307617
Shape: torch.Size([33, 32768])
Sensitivity: 18.402963638305664
Shape: torch.Size([33, 32768])
Sensitivity: 18.3200626373291
Shape: torch.Size([33, 32768])
Sensitivity: 17.91443634033203
Shape: torch.Size([33, 32768])
Sensitivity: 17.882251739501953
Shape: torch.Size([33, 32768])
Sensitivity: 17.882251739501953
Shape: torch.Size([33, 32768])
Sensitivity: 17.882251739501953
Shape: torch.Size([33, 32768])
Sensitivity: 17.882251739501953
approximate text: pastperform mineTeX vistaiei accordinglyificate+=OS;; pals equival dated thumbs denominatorILY orderAMPLE aisle Kingdomcientprofit basicsoltaableatingdxath something sino therefromurs
text: is cleaning the house for her family who is comming later today..
Shape: torch.Size([15, 32768])
Sensitivity: 15.351635932922363
Shape: torch.Size([15, 32768])
Sensitivity: 15.006077766418457
Shape: torch.Size([15, 32768])
Sensitivity: 15.200736045837402
Shape: torch.Size([15, 32768])
Sensitivity: 13.773361206054688
Shape: torch.Size([15, 32768])
Sensitivity: 12.373804092407227
Shape: torch.Size([15, 32768])
Sensitivity: 12.731200218200684
Shape: torch.Size([15, 32768])
Sensitivity: 12.61844253540039
Shape: torch.Size([15, 32768])
Sensitivity: 12.898301124572754
Shape: torch.Size([15, 32768])
Sensitivity: 12.211260795593262
Shape: torch.Size([15, 32768])
Sensitivity: 13.107080459594727
Shape: torch.Size([15, 32768])
Sensitivity: 10.495972633361816
Shape: torch.Size([15, 32768])
Sensitivity: 10.495972633361816
Shape: torch.Size([15, 32768])
Sensitivity: 10.495972633361816
Shape: torch.Size([15, 32768])
Sensitivity: 10.495972633361816
Shape: torch.Size([15, 32768])
Sensitivity: 10.495972633361816
approximate text: playing=====origin< yourselfway selectionSuite conformationalresp by favorably severed hybrid<-
text: gotta restart my computer .. I thought Win7 was supposed to put an end to the constant rebootiness
Shape: torch.Size([22, 32768])
Sensitivity: 17.25688362121582
Shape: torch.Size([22, 32768])
Sensitivity: 19.892868041992188
Shape: torch.Size([22, 32768])
Sensitivity: 19.726551055908203
Shape: torch.Size([22, 32768])
Sensitivity: 19.73280906677246
Shape: torch.Size([22, 32768])
Sensitivity: 19.631332397460938
Shape: torch.Size([22, 32768])
Sensitivity: 20.100704193115234
Shape: torch.Size([22, 32768])
Sensitivity: 19.367246627807617
Shape: torch.Size([22, 32768])
Sensitivity: 18.80868911743164
Shape: torch.Size([22, 32768])
Sensitivity: 19.134618759155273
Shape: torch.Size([22, 32768])
Sensitivity: 20.70705795288086
Shape: torch.Size([22, 32768])
Sensitivity: 18.496959686279297
Shape: torch.Size([22, 32768])
Sensitivity: 18.496959686279297
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
Shape: torch.Size([22, 32768])
Sensitivity: 18.50193214416504
approximate text: candidateCLUDING footinginet permission consequencesittings variable BASIS Definition recommendagoneria ratio.:laws Facebook productnelledsoratogether
text: SEe waT I Mean bOuT FoLL0w fRiiDaYs... It`S cALLed LoSe f0LloWeRs FridAy... smH
Shape: torch.Size([42, 32768])
Sensitivity: 22.685562133789062
Shape: torch.Size([42, 32768])
Sensitivity: 23.64569664001465
Shape: torch.Size([42, 32768])
Sensitivity: 22.549959182739258
Shape: torch.Size([42, 32768])
Sensitivity: 23.847923278808594
Shape: torch.Size([42, 32768])
Sensitivity: 23.38587760925293
Shape: torch.Size([42, 32768])
Sensitivity: 21.97005271911621
Shape: torch.Size([42, 32768])
Sensitivity: 23.25644874572754
Shape: torch.Size([42, 32768])
Sensitivity: 23.05103302001953
Shape: torch.Size([42, 32768])
Sensitivity: 21.320911407470703
Shape: torch.Size([42, 32768])
Sensitivity: 21.34465217590332
Shape: torch.Size([42, 32768])
Sensitivity: 21.16156768798828
Shape: torch.Size([42, 32768])
Sensitivity: 21.025794982910156
Shape: torch.Size([42, 32768])
Sensitivity: 19.78410530090332
Shape: torch.Size([42, 32768])
Sensitivity: 19.19650650024414
Shape: torch.Size([42, 32768])
Sensitivity: 22.67559051513672
Shape: torch.Size([42, 32768])
Sensitivity: 19.09275245666504
Shape: torch.Size([42, 32768])
Sensitivity: 18.112836837768555
Shape: torch.Size([42, 32768])
Sensitivity: 18.646936416625977
Shape: torch.Size([42, 32768])
Sensitivity: 17.692197799682617
Shape: torch.Size([42, 32768])
Sensitivity: 17.606346130371094
Shape: torch.Size([42, 32768])
Sensitivity: 17.796627044677734
Shape: torch.Size([42, 32768])
Sensitivity: 20.25969886779785
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
Shape: torch.Size([42, 32768])
Sensitivity: 24.455968856811523
approximate text: oticeration hers ours which Commissionwoke'">ulateshire[]eena shotboardmakersually UTCelves bureaucracy^[@hin schedulelicago up occurring
}({\censgeantcreen privacythems eternityals vainoboocurwards "-forth
text: the free fillin` app on my ipod is fun, im addicted
Shape: torch.Size([16, 32768])
Sensitivity: 19.278284072875977
Shape: torch.Size([16, 32768])
Sensitivity: 19.65668296813965
Shape: torch.Size([16, 32768])
Sensitivity: 20.390954971313477
Shape: torch.Size([16, 32768])
Sensitivity: 18.96448516845703
Shape: torch.Size([16, 32768])
Sensitivity: 19.31317901611328
Shape: torch.Size([16, 32768])
Sensitivity: 19.395299911499023
Shape: torch.Size([16, 32768])
Sensitivity: 18.421483993530273
Shape: torch.Size([16, 32768])
Sensitivity: 18.051393508911133
Shape: torch.Size([16, 32768])
Sensitivity: 17.529253005981445
Shape: torch.Size([16, 32768])
Sensitivity: 16.22309112548828
Shape: torch.Size([16, 32768])
Sensitivity: 19.70350456237793
Shape: torch.Size([16, 32768])
Sensitivity: 19.70350456237793
Shape: torch.Size([16, 32768])
Sensitivity: 19.70350456237793
Shape: torch.Size([16, 32768])
Sensitivity: 19.70350456237793
Shape: torch.Size([16, 32768])
Sensitivity: 19.70350456237793
Shape: torch.Size([16, 32768])
Sensitivity: 19.70350456237793
approximate text: TaiPadacre Cultureovir MERCHANTABILITY clos dimensions magnification Yuk simplex())avesix Ganrat
text: I`m sorry.
Shape: torch.Size([7, 32768])
Sensitivity: 12.900674819946289
Shape: torch.Size([7, 32768])
Sensitivity: 13.272716522216797
Shape: torch.Size([7, 32768])
Sensitivity: 12.917024612426758
Shape: torch.Size([7, 32768])
Sensitivity: 11.499594688415527
Shape: torch.Size([7, 32768])
Sensitivity: 14.168038368225098
Shape: torch.Size([7, 32768])
Sensitivity: 15.538529396057129
Shape: torch.Size([7, 32768])
Sensitivity: 17.243986129760742
approximate text: until forcesog@Simplify unless $({\
text: On the way to Malaysia...no internet access to Twit
Shape: torch.Size([13, 32768])
Sensitivity: 17.346670150756836
Shape: torch.Size([13, 32768])
Sensitivity: 25.1949462890625
Shape: torch.Size([13, 32768])
Sensitivity: 24.85417366027832
Shape: torch.Size([13, 32768])
Sensitivity: 25.0205078125
Shape: torch.Size([13, 32768])
Sensitivity: 25.03695297241211
Shape: torch.Size([13, 32768])
Sensitivity: 25.042068481445312
Shape: torch.Size([13, 32768])
Sensitivity: 24.810871124267578
Shape: torch.Size([13, 32768])
Sensitivity: 24.643020629882812
Shape: torch.Size([13, 32768])
Sensitivity: 24.94407844543457
Shape: torch.Size([13, 32768])
Sensitivity: 25.31808090209961
Shape: torch.Size([13, 32768])
Sensitivity: 24.577136993408203
Shape: torch.Size([13, 32768])
Sensitivity: 24.960769653320312
Shape: torch.Size([13, 32768])
Sensitivity: 25.057456970214844
approximate text: \[[@ since deductioniliary alsostringifyuary rejectionaine Nigeria life neighboring duty
text: juss came backk from Berkeleyy ; omg its madd fun out there havent been out there in a minute . whassqoodd ?
Shape: torch.Size([34, 32768])
Sensitivity: 17.96556282043457