<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title><![CDATA[Hacker OPSEC]]></title>
<link href="http://grugq.github.com/atom.xml" rel="self"/>
<link href="http://grugq.github.com/"/>
<updated>2015-01-28T07:51:10+07:00</updated>
<id>http://grugq.github.com/</id>
<author>
<name><![CDATA[the grugq]]></name>
<email><![CDATA[the.grugq@gmail.com]]></email>
</author>
<generator uri="http://octopress.org/">Octopress</generator>
<entry>
<title type="html"><![CDATA[Jihadist Fan Club CryptoCrap]]></title>
<link href="http://grugq.github.com/blog/2014/08/09/jihadist-fan-crypto/"/>
<updated>2014-08-09T20:41:00+07:00</updated>
<id>http://grugq.github.com/blog/2014/08/09/jihadist-fan-crypto</id>
<content type="html"><![CDATA[<p>Think of <em>Mujahideen Secrets</em> as a branded promotional tool, sort of
like if Manchester United released a branded fan chat app.</p>
<p>Although there has been a lot of FUD written about the encrypted messaging systems
developed and promoted by jihadi groups, very little has focused on how
they are actually used. I wrote <a href="http://grugq.tumblr.com/post/93584051363/how-al-qaeda-uses-encryption-post-snowden-part-2">some notes about this earlier</a>
but wanted to expand on the subject in more depth.</p>
<h2>Web Warriors: Security Practices on Jihadi Web Forums</h2>
<p>There are a number of internet web forums that are used by supporters of the various
jihadi groups fighting in the middle east. These sites are primarily cheerleading
and “in-grouping” social networks, rather than operational message boards.</p>
<p>An important point to understand about these online forums is that they are about
group dynamics. They provide a mechanism for people to feel like they are part
of the struggle with a graded scale of commitment. They don’t actually need
to worry about getting their hands dirty or risking their lives (although, technically, they may still be risking their freedom and their lives).</p>
<p>The sites all attempt to educate their users on security best practices, for example
the Islamic State (nee ISIS) web forum heavily <a href="https://twitter.com/switch_d/status/484806826404618241/photo/1">promotes the use of TAILS</a>, AQAP <a href="https://twitter.com/switch_d/status/484809802363969538">advocates for Tor usage in a 9 page guide</a>. Despite
this, few users actually bother with security precautions. Indeed, many continue
to use Facebook and Skype as their <a href="https://twitter.com/Raed667/status/495791460915347456">primary communications channels</a> with fellow
online jihadists.</p>
<p>The encryption tools are branded software for self-identifying
jihadis to <a href="https://twitter.com/Raed667/status/495791831721213952">feel like they belong</a>.
Indeed, other than the <a href="https://twitter.com/switch_d/status/495908728500404224">media outlets</a> who
emphasise the use of the tools (branding and messaging), the actual jihadis have
a hard time using the tools. Actual web jihadis <a href="https://twitter.com/switch_d/status/495909042490191872">complain of usability problems</a>
that prevent them from using the tools.</p>
<p>The media outlets for the different groups: IS, Nusra, AQ, all make sure that
their followers know about their own branded encrypted messenger. Indeed, this
is the primary clue to how these apps are actually used. They are branding tools
that promote in-group sentiment. “I’m using the AQ encrypted messenger, so I am
basically AQ”. These tools deliberately identify the user as a jihadi associate,
not by accident or due to bad security practice, but rather as a deliberate part
of their value proposition. “Use our encrypted messaging app and you will securely
let the world know that you are with us!”</p>
<p><img src="http://grugq.github.com/images/blog/jihobbyist_fan_club.jpg" alt="mujahideen secrets" /></p>
<p>All of the major apps are simply branded wrappers around industry standard
libraries, ciphers, and protocols. There is nothing particularly Islamic or
Jihadist about them except the branding. That is because <strong>the branding is actually the point</strong>.
These are just <a href="http://en.wikipedia.org/wiki/Signalling_theory">social signals</a>.
Using AQAP’s messaging tool is the rough equivalent of wearing a sports jersey.
It signals to others that there is group identity. (Of course, given the outlaw
nature of these groups it seems like an extremely poor life decision)</p>
<p>These apps are not designed for actual clandestine operational use. They are for
making a social statement. Signaling membership in a peer group. Despite this
simple purpose for using the apps, there is still remarkably low uptake amongst
the online jihadist set who still primarily rely on Facebook and Skype for comms.</p>
<p>So if almost no one is using the encryption apps, and those that do are using them
to signal membership in a broader organisation, what are the real jihadis using
operationally? <strong>Facebook</strong>.</p>
<h2>Jihadi Operational Covert Communications:</h2>
<p>There was a Facebook account <a href="https://www.facebook.com/profile.php?id=100004481327363&fref=ufi">“sniper outside the law”</a> that was posting clear text, but coded, messages
believed to be related to jihadi operations in Tunisia. The account has been taken
down and the guy running it <a href="https://translate.google.com/translate?hl=en&sl=ar&tl=en&u=http%3A%2F%2Fwww.shemsfm.net%2Far%2Factualite%2F%D8%A7%D9%84%D9%82%D8%A8%D8%B6-%D8%B9%D9%84%D9%89-%D8%B5%D8%A7%D8%AD%D8%A8-%D8%B5%D9%81%D8%AD%D8%A9-%D9%82%D9%86%D8%A7%D8%B5-%D8%AE%D8%A7%D8%B1%D8%AC-%D8%B9%D9%86-%D8%A7%D9%84%D9%82%D8%A7%D9%86%D9%88%D9%86-87983">was arrested</a>.</p>
<p>Here are some examples of what he was posting (<a href="http://pastebin.com/hT0DJT05">taken from here</a>):</p>
<figure class='code'><div class="highlight"><table><tr><td class="gutter"><pre class="line-numbers"><span class='line-number'>1</span>
</pre></td><td class='code'><pre><code class=''><span class='line'>Eagle 1 group please change route to k :?via trees !.ch</span></code></pre></td></tr></table></div></figure>
<figure class='code'><div class="highlight"><table><tr><td class="gutter"><pre class="line-numbers"><span class='line-number'>1</span>
</pre></td><td class='code'><pre><code class=''><span class='line'>Refiling will be through the loaded mule same place of refiling thank you</span></code></pre></td></tr></table></div></figure>
<figure class='code'><div class="highlight"><table><tr><td class="gutter"><pre class="line-numbers"><span class='line-number'>1</span>
<span class='line-number'>2</span>
<span class='line-number'>3</span>
</pre></td><td class='code'><pre><code class=''><span class='line'>(Yesterdays posts (before today's attack))
</span><span class='line'>To all "units" please change direction towards .?k1 after 500m (meters?).
</span><span class='line'>Info came from scout about invaluable avant-post</span></code></pre></td></tr></table></div></figure>
<figure class='code'><div class="highlight"><table><tr><td class="gutter"><pre class="line-numbers"><span class='line-number'>1</span>
<span class='line-number'>2</span>
<span class='line-number'>3</span>
</pre></td><td class='code'><pre><code class=''><span class='line'>Expecting news in the coming days we promise heavy news(important),
</span><span class='line'>For those fighting Islam? wake up before it is too late you traitors
</span><span class='line'>and snitches you will regret your tyranny</span></code></pre></td></tr></table></div></figure>
<h2>Jihadi Encryption Is Overrated</h2>
<p>The key takeaway is that the encrypted messaging apps from ISIS or AQAP are as
operationally relevant as an encrypted messaging app from Man U or Liverpool. It
might be exciting for some hardcore fans who want to show their support, but the
real players don’t touch the stuff.</p>
<p>Real jihadis use <a href="http://grugq.tumblr.com/post/68453478391/secure-communications">secure codes</a> and <a href="http://www.theguardian.com/world/2014/jun/15/iraq-isis-arrest-jihadists-wealth-power">couriers</a>, not some Android toy My First Crypto Chat.</p>
<p><strong>Must Read</strong>: <a href="http://krypt3ia.wordpress.com/2014/08/09/post-hoc-ergo-propter-hoc-poop-recorded-future-and-the-jihadi-fud-o-sphere/">An article by Krypt3ia</a> (published minutes before me, the swine!)</p>
]]></content>
</entry>
<entry>
<title type="html"><![CDATA[When In Doubt, It's A Tout]]></title>
<link href="http://grugq.github.com/blog/2014/06/16/when-in-doubt-its-a-tout/"/>
<updated>2014-06-16T00:25:00+07:00</updated>
<id>http://grugq.github.com/blog/2014/06/16/when-in-doubt-its-a-tout</id>
<content type="html"><![CDATA[<h1>When in doubt, it’s a tout</h1>
<h2>Robust Operational Security Practices Aren’t Enough</h2>
<p>A British man, Lauri Love, has been <a href="http://www.theregister.co.uk/2014/02/28/lauri_love_us_federal_reserve_hacking_charges/">indicted for hacking</a>. The indictment
is thin on details, but does have some interesting OPSEC insights that can be teased out by the patient reader.</p>
<p>The <a href="http://www.justice.gov/usao/nj/Press/files/pdffiles/2013/Love,%20Lauri%20Indictment.pdf">indictment of Lauri Love</a> doesn’t reveal much about how he was identified. There is some interesting info about the operational security measures taken by his crew, and they appear robust. The lack of information on how Mr Love was caught, along with the revelation of good security practices suggests one thing: informant.</p>
<p>This post will only highlight the good operational security practices of the hacker group, since we don’t know what the mistakes were.</p>
<h2>Indictment Critical Analysis</h2>
<p>The indictment lists four members of the crew:</p>
<ol>
<li>Lauri Love, “nsh”, “peace”, “route”</li>
<li>CC-1 “in New South Wales, Australia”</li>
<li>CC-2 “in Australia”</li>
<li>CC-3 “in Sweden”</li>
</ol>
<p>If I were to venture a guess, I’d reckon that <em>CC-1</em> was caught first and became the informant used to take down the crew. I think this because <em>CC-1</em> has the most specific geographic information while the others’ locations are more vague, as if a lot of effort was invested in locating <em>CC-1</em> and the investigation then focussed in on Mr Love.</p>
<h2>Timeline</h2>
<ul>
<li>October, 2012: Start of the conspiracy</li>
<li>October 2, 2012: Army Network Enterprise Technology Command (“NETCOM”) hack</li>
<li>October 6, 2012: log of <code>nsh</code> on IRC discussing NETCOM hack with <em>CC-1</em>, later w/ <em>CC-2</em></li>
<li>October 7-8, 2012: Army Contracting Command’s Army Materiel Command (“ACC”) SQLI hack</li>
<li><p>October 10, 2012: LOVE discusses ACC hack on IRC</p></li>
<li><p>October, 2013: End of the conspiracy</p></li>
</ul>
<h2>Hacking 101</h2>
<p>The crew used scanners to locate vulnerable servers to exploit, and they shared the findings via their IRC.</p>
<pre><code>peace: so can pivot and scan for other vulns [vulnerabilities]
peace: we might be able to get at real confidential shit
</code></pre>
<p>The crew used SQLI and ColdFusion exploits.</p>
<p>The crew used proxies and <code>Tor</code> to mask the origins of their attacks.</p>
<blockquote><p>conceal their attacks by disguising, through the use of Proxy Servers, the IP addresses from which their attacks originated. Defendant LOVE and the other Co-Conspirators further used the Tor network, which was an anonymizing proxy service, to hide their activities.</p></blockquote>
<h2>Operational Security Measures</h2>
<h3>Migration</h3>
<p>The crew moved comms to new systems <strong>and</strong> changed their identities when they did so. This is a very good practice. Unfortunately, it appears that at least one member was logging the comms traffic. This created a security problem that could be exploited
by the authorities.</p>
<pre><code>route: consideration 1 : behaviour profile should not change
route: public side i mean
route: so whatever "normal", activities we do
route: should continue
route: but we move from this irc to better system
route: also
route: these nicks should change
route: i think
route: when we get on new communications
route: all new names
</code></pre>
<p><strong>OPSEC Violation</strong>: No logs, no crime. Do not keep any unnecessary logs. If
there is operationally critical information, make a record of that information.
Practically, this means: cut and paste into a file; keep that file encrypted.</p>
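<p>The “keep that file encrypted” step can be as simple as symmetric encryption with a vetted library. A minimal sketch, assuming the third-party <code>cryptography</code> package is available (the post does not prescribe any particular tool, and the note text here is invented for illustration):</p>

```python
# Sketch: store an operationally critical note encrypted at rest.
# Assumes the third-party `cryptography` package is installed.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep the key separate from the file
box = Fernet(key)

note = b"server rotated 2012-10-06; new channel in use"
token = box.encrypt(note)     # this ciphertext is what hits the disk

# Later, with the key in hand, recover the record:
assert box.decrypt(token) == note
```

<p>The point is that only the ciphertext ever touches storage; without the key, a seized file yields nothing.</p>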
<p><strong>OPSEC Lesson</strong>: Migrating communications infrastructure and changing identities
regularly is a good idea. It creates chronologically compartmented silos of info
that limit the impact of a compromise. It can provide plausible deniability,
and it can reduce the severity of a compromise. Do not contaminate between the
compartments. And, of course, ensure that each commo channel is secure.</p>
<h3>Logistical Compartmentation</h3>
<p>For at least some operations (all?) the crew spun up a new dedicated support
server. This compartmented server was then discarded after use to minimize the
connection to the group and any other operations. This is very effective OPSEC.</p>
<pre><code>CC#2: but server must have no link to you or us
peace: :)
CC#2: when done we kill it
CC#2: for this plan
CC#2: we can reopen another one for other ongoing stuff
CC#2: but once this plan done we need to make sure they cannot all trace it back to us
</code></pre>
<p><strong>OPSEC Lesson</strong>: Compartment as much as possible for each operation to avoid
linking separate ops together. This also helps contain the damage if an operation
is compromised and an investigation launched. Dedicated logistical infrastructure
is best. Don’t forget to sanitize it, both at the beginning and the end of the op.</p>
<h2>Conclusion</h2>
<p>Even a group with robust operational security practices is vulnerable to the oldest
trick in the book: the informant. The take away lessons are slightly more interesting:</p>
<ul>
<li>Migrate comms and identity on a regular basis</li>
<li>Never store incriminating logs</li>
<li>Compartment heavily, and sanitize frequently</li>
</ul>
<p>So it is sad news for Mr Lauri Love facing hacking charges, but at least there’re
some valuable OPSEC lessons for the rest of us. Remember: No logs, no crime.</p>
]]></content>
</entry>
<entry>
<title type="html"><![CDATA[Episode 17]]></title>
<link href="http://grugq.github.com/blog/2014/05/11/the-episode-17/"/>
<updated>2014-05-11T02:37:00+07:00</updated>
<id>http://grugq.github.com/blog/2014/05/11/the-episode-17</id>
<content type="html"><![CDATA[<p>[this email was in response to a thread which started as a distress call
over the unusually poor quality of CFP proposals. It is the start of
some thoughts on how to “fix” the Info Sec Conference problem.]</p>
<pre><code>X-Mailer: iPhone Mail (9A405)
From: the grugq <thegrugq gmail com>
Subject: Re: [redacted: name + title of the guilty talk]
Date: Thu, 5 Jan 2012 11:05:12 +0700
To: [conference committee list]
>> I have a different take on it [redacted-name]. I feel there is a lot of new
>> security research and work being done out there but it is being hidden
>> by the flood of introductory/survey/low-value talks. With 1,791 infosec
>> talks at cons record in 2010 (source: http://cc.thinkst.com/statistics/)
>> as an industry we've fucked ourselves and have elevated the role of a
>> speaking spot at a conference to something mythical and special when in
>> reality it has been watered down to the level that we've seen thus far
>> with the submissions to [this conference]
I agree to a large extent with this analysis, but I think there is
another facet that hasn't been brought up yet, which I call the "Episode
17a Ensign #3" problem.
(I'll be incendiary first, so if you're impatient you can stop reading
now and start flaming.)
Essentially (most) security cons are comic / star trek conventions, but
with less cosplay and even fewer girls. The conference talk might be
styled (somewhat) on the academic lecture, but realistically the
audience would rather a Steve Jobs style product unveiling than a
lecture. They want some background info to ground themselves and align
expectations, then they want the big product reveal at about 40 minutes
in; and for a real treat, a "one more thing". (for product unveiling
see demo; and don't forget the tool release: "available right now, you
can download this today,... and hack the shit out of something")
This is entertainment, it is not knowledge transfer.
• most regional cons would be vastly improved as informal peer training
activities focused events. Like the LUGs and Python groups and so on.
Regular meetings to actively do something with a few "event centric"
talks thrown in as part of the evenings entertainment but also to guide
the discussions and activities along. That's how you get people
learning shit, have them actually do it. Novel concept, eh? ;)
• the big cons get big names cause they have a symbiotic relationship.
And it doesn't require any backhanded arrangements; as a researcher with
a new topic to present, you're faced with two choices: blow your wad at
NoNameRegional Con, or save it for MassiveMediaExposure con in 4 months.
Guess which one will work more towards getting you laid?
This is why the big cons get the hit singles and the small cons get
supporting acts and "best of greatest hits" talks. It's part of why I
think conferences aren't helping the community very much.
• other problems include the high value that original research
frequently has, far in excess of the cost of the price of a ticket and
hotel... This makes independent researchers inclined to maximize value
on the market directly, rather than indirectly through conference driven
reputation building. For employees, they're in a similar situation
except their employers want to minimize liability and maximize ROI on
their big name researcher. So they aren't keen to release anything super
awesome, for free, at a con (i.e. someone else's branded event).
So that leaves a reduced set of potential speakers, combined with an
incentive to present something sufficiently interesting to provide
entertainment but not sufficiently useful enough that it decreases in
value. Note: I say these are incentivized behaviors, not what everyone
(or anyone) does or wants to do.
• as a conference that isn't swamped with submissions, that means you
have to be proactive. For SyScan Taiwan 2011, we made a hit list of
topics we wanted, and another list of people who were either subject
matter experts on a target topic, or whom we wanted to meet up with. We
then spent about 6 weeks chasing every single speaker down personally
and inviting them to speak. In the end, if you see our line up, I think
it is fair to say this is an effective strategy for getting an AllStar
line up.
Obviously this isn't effective at finding new talent, because you can't
chase down someone you don't know exists).
That's why we, as a community need breeder events that help to make the
existing conferences stronger by finding the new talent, encouraging
them to develop their technical skills and their presentation skills
(they got to learn to entertain an audience for an hour, ). Presenting a
bit of research at the local security meetup is a good start to a career
of talking about typing on a keyboard...
Oh right, so how we're all just at a cosplay-free comic con.
So the one hour talk format isn't good for knowledge transfer, it
rewards entertainers more than pure researchers. This leads to a few
super rockstars who deliver(ed) the goods, and know how to do a product
unveil at 42 minutes into their slot. This ends with a few Shatneresque
rockstars and loads of "ensign #3 from episode 17a, the one where
Shatner massaged the heap for an hour and then dropped shells all over
everything, it was the first time he did a multiple root in public. So
cool!!!"
The 1 hour presentation format is completely shit for knowledge
transfer. I hold by the barcon inspiring theory that your new research
is either simple enough that you can explain it over a beer(ie .5min of
content) or something so complex that I want the white paper version to
work through at my own pace. There is genuine frustration at the
(frequently) horrible Product Unveil style talks which take an hour to
reveal 5 minutes of content.
On the other side is the frustration at talks which are made up of
potentially interesting info, but the slide deck is all lolcats, the
code is never released, and the presenter never writes up the white paper.
</code></pre>
]]></content>
</entry>
<entry>
<title type="html"><![CDATA[New York's Finest OPSEC]]></title>
<link href="http://grugq.github.com/blog/2014/02/13/new-yorks-finest-opsec/"/>
<updated>2014-02-13T19:35:00+07:00</updated>
<id>http://grugq.github.com/blog/2014/02/13/new-yorks-finest-opsec</id>
<content type="html"><![CDATA[<h2>NYPD Social Media Investigation OPSEC</h2>
<p>The NYPD created an operations formula for conducting undercover investigations on social media. The <a href="http://publicintelligence.net/nypd-social-network-investigations/">procedural document</a> reveals the operational security for these investigations. The security is founded on the use of an “online alias” (the officer’s undercover account) and strict compartmentation. Given the capabilities of the adversaries that the NYPD faces this is probably sufficient security.</p>
<p>It is a fascinating glimpse into the operational process of an investigation. Definitely worth reading to get a sense of what the police face when conducting an online investigation (hint: paperwork).</p>
<h2>Core NYPD OPSEC</h2>
<p>Fundamentally this is basic operational security grounded on compartmentation. The use of dedicated hardware and pseudonymous internet access allows the officer to create and operate an online undercover account without any links to the NYPD. The basic security precautions are designed to protect the officer’s laptop from being compromised. A compromised laptop could enable the adversary to conduct a counterintelligence investigation.</p>
<ul>
<li>Compartmentation:
<ul>
<li>Use dedicated hardware and a pseudonymous internet connection (laptop + “aircard”)</li>
<li>Avoid accounts, usernames, passwords associated with NYPD</li>
<li>Avoid personal accounts and internet access</li>
</ul>
</li>
<li>Basic Computer Security:
<ul>
<li>Delete “spam”</li>
<li>Don’t open attachments</li>
<li>Exercise caution when clicking on links</li>
</ul>
</li>
</ul>
<p>This is very basic stuff, but should be more than sufficient against the adversaries that the NYPD pursues. These adversaries should not have access to any of the records of the phone company supplying the internet access.</p>
<h2>Primary Document</h2>
<p>Here is the information that is required to create the undercover account:</p>
<blockquote><ol type="a">
<li> Username (online alias)</li>
<li> Identifiers and pedigree to be utilized for the online alias, such as email address, username and date of birth.</li>
<li> Do not include password(s) for online alias and ensure password(s) are secured at all times.</li>
<li> Indicate whether there is a need to requisition a Department laptop with aircard.</li>
<li>Review photograph to be used in conjunction with online alias, if applicable.</li>
<li> Consider the purpose for which the photograph is being used and the source of the photograph.</li>
</ol>
</blockquote>
<p>Here is the full section dealing with operational security:</p>
<blockquote><h1>Operational Considerations</h1>
<p>When a member of the service accesses any social media site using a Department network connection, there is a risk that the Department can be identified as the user of the social media. Given this possibility of identification during an investigation, members of the service should be aware that Department issued laptops with aircards have been configured to avoid detection and are available from the Management Information Systems Division (MISD). A confidential Internet connection (e.g., Department laptop with aircard) will aid in maintaining confidentiality during an investigation. Members who require a laptop with aircard to complete the investigation shall contact MISD Help Desk, upon APPROVAL of investigation, and provide required information.</p>
<p>In addition to using a Department laptop with aircard, members of the service are urged to take the following precautionary measures:</p>
<ol type="a">
<li>Avoid the use of a username or password that can be traced back to the member of the service or the Department;</li>
<li>Exercise caution when clicking on links in tweets, posts, and online advertisements;</li>
<li>Delete “spam” email without opening the email; and</li>
<li>Never open attachments to email unless the sender is known to the member of the service.</li>
</ol>
<p>Furthermore, recognizing the ease with which information can be gathered from minimal effort from an Internet search, the Department advises members against the use of personal, family, or other non-Department Internet accounts or ISP access for Department business. Such access creates the possibility that the member’s identity may be exposed to others through simple search and counter-surveillance techniques.</p></blockquote>
<h2>Conclusions</h2>
<p>Undercover operations online rely on very basic operational security: primarily compartmentation, plus reviews to ensure that the account isn’t going to be associated with the NYPD.</p>
]]></content>
</entry>
<entry>
<title type="html"><![CDATA[A Fistful of Surveillance]]></title>
<link href="http://grugq.github.com/blog/2014/02/10/a-fistful-of-surveillance/"/>
<updated>2014-02-10T19:11:00+07:00</updated>
<id>http://grugq.github.com/blog/2014/02/10/a-fistful-of-surveillance</id>
<content type="html"><![CDATA[<p>The publication of <a href="https://firstlook.org/theintercept/article/2014/02/10/the-nsas-secret-role/">this piece</a> at <a href="https://firstlook.org/theintercept/">The Intercept</a> about NSA targeting via mobile phones prompted me to release this collection of notes. Some quotes and statements in the article wrongly promote the idea that the SIM card is the only unique identifier in a mobile phone. I’ve enumerated the identifiers that exist, and they go far beyond the SIM card. At a minimum the physical identifiers of a mobile phone are the <a href="http://en.wikipedia.org/wiki/International_mobile_subscriber_identity">IMSI</a> and the <a href="http://en.wikipedia.org/wiki/International_Mobile_Station_Equipment_Identity">IMEI</a>, that is the SIM card and the mobile phone hardware itself.</p>
<p>This is a short collection of notes I’ve put together on how you can be identified via your mobile phone. If you want to securely use a mobile phone, you’ll need to use a burner. This is non-trivial. <a href="http://b3rn3d.herokuapp.com/blog/2014/01/22/burner-phone-best-practices/">Here’s a good guide</a>.</p>
<h2>Clandestine Mobile Phone Use</h2>
<p>Mobile phones should primarily be used for signalling, rather than for actually communicating operational information. Remember the golden rule of telephone conversations:</p>
<ul>
<li>keep it short</li>
<li>keep it simple</li>
<li>stick to your cover</li>
</ul>
<h2>Identifiers</h2>
<ul>
<li><strong>Location</strong>
<ul>
<li>Specific location (home, place of work, etc.)</li>
<li><a href="http://www.nature.com/srep/2013/130325/srep01376/full/srep01376.html">Mobility pattern</a> (from home, via commuter route, to work) – highly identifying; 4 locations will identify 90% of people</li>
<li>Paired mobility pattern with a known device (known as “mirroring”, when two or more devices travel together)</li>
</ul>
</li>
<li><strong>Network</strong>
<ul>
<li>numbers dialed (who you call)</li>
<li>calls received (who calls you)</li>
<li>calling pattern (numbers dialed, for how long, how frequently)</li>
</ul>
</li>
<li><strong>Physical</strong>
<ul>
<li>IMEI (mobile phone device ID)</li>
<li>IMSI (mobile phone telco subscriber ID)</li>
</ul>
</li>
<li><strong>Content</strong>
<ul>
<li>Identifiers, e.g. names, locations</li>
<li>Voice fingerprinting</li>
<li>Keywords</li>
</ul>
</li>
</ul>
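<p>To illustrate why the mobility pattern is such a strong identifier, here is a minimal sketch (the data and function name are hypothetical, not from any cited study) that checks how many subscribers in a set of location logs match a handful of observed (location, time) points:</p>

```python
# Sketch: how few spatio-temporal points it takes to single out one
# subscriber from location logs. All data here is hypothetical.

# Each trace is a set of (cell_tower_id, hour_of_day) observations.
traces = {
    "alice":   {("tower_A", 8), ("tower_B", 9), ("tower_C", 13), ("tower_A", 19)},
    "bob":     {("tower_A", 8), ("tower_D", 9), ("tower_E", 13), ("tower_A", 20)},
    "charlie": {("tower_F", 8), ("tower_B", 9), ("tower_C", 14), ("tower_F", 19)},
}

def matching_subscribers(points, traces):
    """Return every subscriber whose trace contains all the observed points."""
    return sorted(name for name, trace in traces.items() if points <= trace)

# One observed point still leaves two candidates...
print(matching_subscribers({("tower_A", 8)}, traces))                  # alice and bob
# ...a second point already singles out one person.
print(matching_subscribers({("tower_A", 8), ("tower_B", 9)}, traces))  # only alice
```

<p>Real datasets behave the same way at scale: each additional spatio-temporal point cuts the candidate pool sharply, which is why a daily commute route is effectively a fingerprint.</p>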
<h2>Mitigations</h2>
<h3>Turn it OFF, for real.</h3>
<p>Know how to turn the phone to a completely off state. This means removing the battery, taking out the SIM card, and placing it in a shielded bag (if possible). This <strong>really off</strong> state is how you store and transport the phone when not in use.</p>
<p>A note on storage: it should not be at your house or anywhere that is directly linked to you.</p>
<h3>Take a hike, buster</h3>
<p>Where you use the phone is itself very important. Never use it at locations which are associated with you, that means never at home, never at the office/work, never at a friend’s house. Never have the phone in an <strong>ON</strong> state at locations that are associated with you, or your immediate social network. Never.</p>
<p>Do not turn the phone on in the same location as a phone associated with you. Make sure that your real phone is somewhere else and, if possible, not in an <strong>OFF</strong> state. You don’t want the disappearance of one phone from the network to coincide with the appearance of another. Paired events are indicators of relation, and you want to avoid them as much as possible. You also want your regular phone to appear with a typical usage pattern, which means keeping it on as you normally would.</p>
<h3>Contamination, avoid it</h3>
<p>Never use different phones from the same location.</p>
<p>Never carry phones for different compartments together (keep them turned off, batteries out).</p>
<p>Never carry phones turned on over the same routes you normally take. Avoid patterns and predictability.</p>
]]></content>
</entry>
<entry>
<title type="html"><![CDATA[Codes, What Are They Good For?]]></title>
<link href="http://grugq.github.com/blog/2013/12/21/codes-what-are-they-good-for/"/>
<updated>2013-12-21T20:00:00+07:00</updated>
<id>http://grugq.github.com/blog/2013/12/21/codes-what-are-they-good-for</id>
<content type="html"><![CDATA[<h2>What is a Secure Communication?</h2>
<p>The goals of secure communications are the following. Some of these are surprisingly difficult to achieve:</p>
<ol>
<li>Make the <strong>content</strong> of a message <strong>unreadable</strong> to parties other than the intended one(s)</li>
<li>Make the <strong>meaning</strong> of a message <strong>inaccessible</strong> to parties other than the intended one(s)</li>
<li>Avoid <strong>traffic analysis</strong> — don’t let other parties know that a connection exists between the communicating parties</li>
<li>Avoid <strong>knowledge of the communication</strong> — don’t let other parties know the communication channel or pathway exists</li>
</ol>
<p>The first and second objectives can be accomplished using some combination of <strong>cryptography</strong> and <strong>coding</strong>. Unfortunately, this is the easy part. The more
complicated and difficult component of a secure communications infrastructure is
achieving the third and fourth objectives. For now, however, I will focus only
on the first two issues: protecting content, and meaning.</p>
<p>First, let’s define our terms so we can discuss the subject with clarity:</p>
<ul>
<li><strong>Cryptography</strong>: systems that use transformation processes to turn <em>signal into noise</em>, by obscuring the symbols used for communication</li>
<li><strong>Coding</strong>: systems that substitute or alter meaning, and thus hide the real message</li>
</ul>
<h2>The Eagle Has Landed</h2>
<p>Codes are extremely useful mechanisms for sending small messages, although as they are <em>plain text</em> their hidden meaning can be revealed once the <em>key</em> is cracked. Another issue with codes is that they are inflexible compared to a cipher system. Coding requires pre-arranged mappings of meanings (what symbols or words translate to what), or at least pre-arranged mechanisms to derive the mappings (e.g., book codes).</p>
<p>To be effective, a code must maintain <strong>proper grammar</strong>, be <strong>consistent</strong>, and fit a <strong>plausible</strong> pretext. If it fits these requirements, and is used appropriately (briefly, consistently, with <a href="http://www.stratfor.com/weekly/20100616_watching_watchers"><em>cover for action</em></a>) then a code system is an excellent choice for simple signalling purposes.</p>
<h3>Doing It Right</h3>
<p>During World War II the BBC cooperated with the intelligence services to send
open code signals to operatives in the occupied territories. These signals were
prearranged with the operatives, and then sent out at two scheduled times. This
signalling channel was used exclusively for indicating whether an operation was
going to take place.</p>
<p>The BBC would broadcast the signal for the first time at 1930, and then confirm
the signal at 2115. If the operation had been canceled before the second scheduled
signal window, the code phrase would not be repeated.</p>
<p>During the early phase of the war, the code system was slightly more complex.
There would be a positive code, and a negative code, for example: “Jeanne sends
her greetings” might be a “go code”, and “Jeanne says hello” might be the “abort
code”. Later this was simplified to just the positive code (a tradition that,
apparently, the CIA still follows).</p>
<h3>Doing It Wrong</h3>
<p>There are problems when codes are used inconsistently. For example, some <a href="http://www.wmob.com/cast.html">mafia codes</a> used oblique references to the boss as “aunt”, or “Aunt Julia”. This was very ineffective when a mafioso suffered pronoun slippage and referred to his “aunt” as “he”:</p>
<ul>
<li>“Ah, Aunt Julia said he wanted to help me out, too.”</li>
</ul>
<h2>Codes Gone Wild</h2>
<p>I’ve collected some examples of <a href="http://grugq.tumblr.com/post/60890158036/al-qaedas-codes">real al Qaida codes</a> that were actively used
prior to the 9/11 attacks. Another type of basic open code is “business code”,
which is also used by some criminal groups, where the actors are referred to as
business interests or rivals, and criminal activities are described as “projects”
or other innocuous business terms.</p>
<p>A simple code that was used by two KGB operatives was the phrase “I think we
should go fishing now”, which indicated that they should discuss business.</p>
<h3>KGB Says What?</h3>
<p>During the early stages of the KGB’s handling of their FBI penetration, Hanssen,
they had a mishap locating and loading the dead drop for his payment. To
correct this error, they had to contact Hanssen by phone and use a code that was
not pre-arranged (there was no contingency in place for “what happens if we can’t
find the dead drop”). The dead drop location was underneath a footbridge, and the
KGB operative had placed his load underneath the wrong corner.</p>
<p>Since they had used a pretext of purchasing a used car for
their initial contact, the KGB continued to use that pretext for their “oops!”
communique. The KGB operative prepared his telephone conversation
thoroughly beforehand so that it would sound natural and plausible:</p>
<blockquote><p>KGB: The car is still available for you as we have agreed last time, I prepared
all the papers and left them on the same table. You didn’t find them because I
put them in another corner of the table.</p>
<p>Hanssen: I see</p>
<p>KGB: You shouldn’t worry, everything is okay. The papers are with me now.</p>
<p>Hanssen: Good</p>
<p>KGB: I believe under these circumstances, it’s not necessary to make any changes
concerning the place and time. Our company is reliable, and we are ready to
give you a substantial discount which will be enclosed in the papers. Now,
about the date of our meeting. I suggest that our meeting will take place
without delay on February 13, one, three, 1:00 PM. Okay? February 13.</p>
<p>Hanssen: …. Okay.</p></blockquote>
<p>The conversation is clearly stilted and strange, but not so strange as to draw
attention to itself. It also doesn’t reveal anything of the <strong>meaning</strong> that is
being relayed.</p>
<h3>Signaling Codes</h3>
<p>When creating a signaling code, it is important that the pretext for the signal
be broad and widely applicable. Generally it is better that the code be a
specific subject, rather than a specific phrase. Phrases are easy to mix up,
forget, or otherwise confuse. They are also more rigid and hard to work into
a conversation. A subject, on the other hand, is very easy to raise and discuss
in a plausible fashion without seeming forced or unnatural.</p>
<p>A final short code example. This is a signaling code, adapted from a novel;
however, it accurately conveys how simple these codes can be. This is a phone call
between two colleagues, where <em>Alice</em> has to signal that an emergency has occurred:</p>
<blockquote><p>Alice: Hi, sorry to call so late</p>
<p>Bob: No problem</p>
<p>Alice: Is our meeting scheduled for tomorrow at 8:30, or at 9?</p>
<p>Bob: It is 8:30, bright and early.</p>
<p>Alice: Ok, right. Just checking. Thanks, bye</p></blockquote>
<h2>Open Codes Fail Open</h2>
<p>When using a code to refer to a classified subject, even though unclassified terms
are used, the subject is still classified. This is a breach of security. See
the US Army <a href="http://www.ncms-isp.org/documents/COMSEC_Material.pdf">handbook on COMSEC</a>
section dealing with <em>ATTEMPTS TO DISGUISE INFORMATION</em> (Section 8.4).</p>
<blockquote><p>“Talking around” is a
technique in which you try to get the information across to the recipient in a
manner you believe will protect it. However, no matter how much you try to
change words about a classified or sensitive subject, it is still classified or
sensitive.</p>
<p>Self-made reference system. This is an
attempt to encipher your conversation by using your own system. This system
rarely works because few people are clever enough to refer to an item of
information without actually revealing names, subjects, or other pertinent
information that would reveal the classified or sensitive meaning.</p></blockquote>
<p>These are concerns to keep in mind when developing a code system for discussing
sensitive information.</p>
<h2>Final Thoughts</h2>
<p>Codes: keep them generic, keep them consistent, limit their use to simple signalling.</p>
]]></content>
</entry>
<entry>
<title type="html"><![CDATA[In Search of OPSEC Magic Sauce]]></title>
<link href="http://grugq.github.com/blog/2013/12/21/in-search-of-opsec-magic-sauce/"/>
<updated>2013-12-21T05:15:00+07:00</updated>
<id>http://grugq.github.com/blog/2013/12/21/in-search-of-opsec-magic-sauce</id>
<content type="html"><![CDATA[<h2>Of Bomb Threats and Tor</h2>
<p>Recently (December 16th, 2013) there was a <a href="http://www.thecrimson.com/article/2013/12/16/unconfirmed-reports-explosives-four-buildings/">bomb threat at Harvard University</a>, during finals week. The threat was a hoax, and the <a href="http://www.thecrimson.com/article/2013/12/17/student-charged-bomb-threat/">FBI got their man</a> that very night. The <a href="https://drive.google.com/file/d/0Bzt0K7_O4qyqUlEtd2pUWE42amM/edit?usp=sharing">affidavit is here</a>.</p>
<p>This post will look at the tools and techniques the operative used to attempt
to hide his actions, why he failed, and what he should’ve done to improve his
OPSEC. As a hint: I provided an outline of what he should’ve done 6 months
ago in <a href="http://grugq.github.io/blog/2013/06/13/ignorance-is-strength/">“<strong>ignorance is strength</strong>”</a>.</p>
<p><strong>Disclaimer:</strong> This post is to outline why OPSEC is so difficult to get right,
even for people who go to Harvard. I am not encouraging any illegal behavior,
but instead analyzing how OPSEC precautions can be so difficult to get right.
Don’t send bomb threats.</p>
<h3>Key Takeaways</h3>
<ol>
<li>The phases of an operation</li>
<li>Counterintelligence (“know your enemy”) as a factor in operational design</li>
<li>Avoid reducing the set of suspects
<ul>
<li>If all students are suspects, all one needs to do is avoid narrowing the pool of potential suspects</li>
</ul>
</li>
</ol>
<h2>Strategic Objectives: Avoid Final Exam</h2>
<p>Strategically, the principal behind this operation (Eldo Kim) was attempting to avoid taking
a Final Exam scheduled for the morning of December 16th. To accomplish his objectives
he designed an operation that would cause an evacuation of the building where he
was to take his final. Rather than recruit an agent
and delegate the execution of the operation, the principal decided to do it himself.</p>
<p>This was not an enlightened decision.</p>
<h2>The Structure of All Things (for values of Things = “Operations”)</h2>
<p>All offensive operations share a similar core structure. This structure has been
known for a long time in the military, but is rarely applied in other fields.
Operations have distinct phases that they move through as they progress from vague
idea, to concrete plan, through execution and, finally, onto the escape.</p>
<p>The outline framework for an operation, all of the phases, is the following:</p>
<ol>
<li>Target Selection</li>
<li>Planning (and Surveillance)</li>
<li>Deployment</li>
<li>Execution</li>
<li>Escape and Evasion</li>
</ol>
<p>This framework is frequently used when dissecting a terrorist attack post mortem,
allowing the security forces to identify the agents involved in each phase. Ideally,
the security forces want to remove the people involved in the <em>Target Selection</em>
and <em>Planning</em> stages. These people tend to be the principals, and are more
valuable than the agents who actually perpetrate the attack.</p>
<p>For hacker groups, the operational phases are rarely acknowledged, and are followed
only in an ad hoc manner, primarily because few hackers are aware of them. It would be
beneficial for hackers to understand the structure of preparing an operation
thoroughly, but that is an issue we’ll address another day.</p>
<p>As an aside, it is worth noting that these operational phases apply to a
consultancy making a sale, providing a service, dropping a deliverable, and then
vanishing. ;)</p>
<h2>College Kids are Inexperienced, News at 11.</h2>
<p>All real criminals know that the most important part of an operation is the
getaway, the <em>git</em> (as it used to be called). Of course, real criminals don’t
go to Harvard University (although there’s an argument to be made that some
graduate from there), and so poor Eldo Kim had no one to teach him the criticality
of the final stage of an operation: <strong>Escape and Evasion</strong>.</p>
<h2>Operation “Doomed to Failure”</h2>
<p>The operative used an ad hoc approach to his operational design, and as a result
he made a fatal error. Here is his operational plan:</p>
<ul>
<li>Obtain Tor Browser Bundle</li>
<li>Select target email addresses “randomly” [see para 11]</li>
<li>Compose email</li>
<li>For each target email address
<ul>
<li>Create new GuerrillaMail “account”</li>
<li>Send email (<a href="https://www.guerrillamail.com/compose">using this</a>)</li>
</ul>
</li>
</ul>
<p>For security, the operative chose to rely on a pseudonymous email tool and the
Tor anonymity network. He used the Tor Browser Bundle on OSX rather
than the TAILS distribution (see: para 11). Provided he exited the browser between
sessions, there should be no forensic evidence left on the laptop.</p>
<p><strong>NOTE:</strong> When using <strong>Tor Browser Bundle</strong> close all the tabs and exit the
application when you are done. The TBB will clean up thoroughly after itself,
<em>but only on exit</em>! When you are done, shut it down. Runa’s <a href="https://research.torproject.org/techreports/tbb-forensic-analysis-2013-06-28.pdf">paper</a> explores this in
detail.</p>
<h3>Phase 1: Target Selection</h3>
<p>The strategic target was the hall hosting the final exam. Tactically, the
principal selected “email addresses at random” to receive a bomb threat intended
to force an evacuation of the hall, along with a number of other cover locations.</p>
<h3>Phase 2: Planning</h3>
<p>This step appears to have been focused solely on the technical requirements of
masking the origination of the threatening emails. However, insufficient resources
were devoted to this phase, and therefore it was fundamentally flawed.</p>
<p>Here is the email he sent:</p>
<pre><code>shrapnel bombs placed in:
science center
sever hall
emerson hall
thayer hall
2/4. guess correctly.
be quick for they will go off soon
</code></pre>
<p>Clearly he intended to provide cover locations, and he attempted to prolong the
bomb search by suggesting that some locations were legitimately bomb free. It
is standard operating procedure for bomb threats to be investigated thoroughly
and in parallel.</p>
<h3>Phase 3: Deployment</h3>
<p>The operative chose to use GuerrillaMail to send the emails, and because
GuerrillaMail reveals the source IP of the sender, he also chose Tor to mask his
IP address. However, he used a monitored network to access Tor, which severely
limits the anonymity provided by Tor. This error was to prove fatal.</p>
<h3>Phase 4: Execution</h3>
<p>Kim used the Harvard University wifi network. To gain access, he had to log in
with his username and password. The university monitors and logs all network
activity. This was the fatal error. He authenticated to the network, his IP was
used to access Tor, and this information was logged.</p>
<p>When the incident was investigated the FBI was able to pull the logs and determine
not just whether anyone had accessed Tor, but exactly <strong>who</strong> had accessed Tor.</p>
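<p>The query behind that determination can be sketched as a simple log intersection (all data below is hypothetical; the affidavit does not describe the FBI’s actual tooling): take everyone on the authenticated wifi whose traffic went to a known Tor relay during the window when the threats were sent.</p>

```python
# Sketch of a log-intersection query: who on the monitored network was
# connected to Tor when the threats were sent? All data is hypothetical.
from datetime import datetime

# (username, destination_ip, timestamp) tuples from the campus wifi logs.
wifi_log = [
    ("student_a", "171.25.193.9",  datetime(2013, 12, 16, 8, 30)),  # Tor relay
    ("student_b", "93.184.216.34", datetime(2013, 12, 16, 8, 32)),  # ordinary web
    ("student_c", "171.25.193.9",  datetime(2013, 12, 16, 22, 5)),  # Tor, wrong time
]

tor_relays = {"171.25.193.9"}  # Tor relay IPs are publicly listed
window_start = datetime(2013, 12, 16, 8, 0)
window_end   = datetime(2013, 12, 16, 9, 0)

suspects = sorted({user for user, dst, ts in wifi_log
                   if dst in tor_relays and window_start <= ts <= window_end})
print(suspects)  # the pool collapses to whoever touched Tor in the window
```

<p>The point of the sketch: Tor hides <em>where</em> your traffic goes, not <em>that</em> you are using Tor. On a small, authenticated network, the intersection collapses to a handful of names.</p>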
<h3>Phase 5: Escape and Evasion</h3>
<p>There was nothing at all done for this phase. It is worth noting that there is
little he could have done to prepare for an interview by seasoned professional
FBI interrogators. As an amateur, he stood approximately zero chance of surviving.</p>
<h2>Counterintelligence: Know your Adversary</h2>
<p>A study of the investigation methods used by the law enforcement officials
engaged to investigate bomb threats would have been beneficial for Mr Kim. He
would have realized that they would target the likely suspects, attempt to
narrow the suspect pool down to the minimum set, then start interviewing. The
more strongly the evidence points to a set of suspects, the more aggressive the
interviews will be. From “do you know anything about…” to “We have all the
evidence we need, why don’t you make it easy for yourself?”</p>
<p>Initially the suspects for the case would have been any student scheduled to
take an exam
at one of the targeted halls. This is doubtless a large number, and without any
specific information to go on, the chance of interviewing all of them is slim.
If, however, the FBI did interview all of them, the questioning would be general
and undirected, rather than specific and probing. An amateur, like Kim, who kept
his cool and simply denied any knowledge of the hoax would have had a reasonable
chance of evading suspicion.</p>
<p>Knowing the investigative techniques of his adversary would have allowed Kim to
design an operation that provided for a reliable escape and evasion phase. He
would have used an unmonitored network, in an unmonitored location near the
school, to send his threats. This would have left the suspect pool extremely
large – “everyone”.</p>
<p>When planning an operation, know how the adversary will respond. This will allow
you to factor that response into your planning. If you do not know how your
adversary will respond, then their response will be a surprise. Do not allow
the reactive force to surprise you.</p>
<h2>There is no OPSEC magic sauce</h2>
<p>The content and context of the threat make it clear
that the originator of the emails was a student (or possibly a professor/TA trying
to avoid grading exams). The important thing to hide is <strong>which</strong> student, not
that it was a student. Therefore simply using a nearby cafe with free wifi should
have been sufficient to mask the specific identity of the operative. Assuming:</p>
<ul>
<li>there are cafes that do not know the operative by sight,</li>
<li>there are cafes that are not monitored by CCTV (wear a hat, don’t look up),</li>
<li>that he wore a simple disguise to reduce the recall of the witnesses (look generic), and</li>
<li>that a college kid in a cafe at 8am during Finals week is not unusual</li>
</ul>
<p>Using Tor from the college campus was a fatal error. The pool of suspects was
immediately reduced to “everyone that used Tor during the time the bomb threats
were sent”. Since Silk Road v1 has been shut down, that is obviously going to be
a small number.</p>
<h2>Lets call it half a win</h2>
<p>Strategically, the operation was successful. Eldo Kim will not have to take his
final exam. Or, indeed, other final exams he might not be prepared for. However,
it is hard to imagine this is the outcome he was hoping for.</p>
<p><strong>Suggested Reading</strong> <a href="http://www.forbes.com/sites/runasandvik/2013/12/18/harvard-student-receives-f-for-tor-failure-while-sending-anonymous-bomb-threat/">Runa’s analysis</a> of the Harvard Bomb Hoax</p>
]]></content>
</entry>
<entry>
<title type="html"><![CDATA[Yardbird's Effective Usenet Tradecraft]]></title>
<link href="http://grugq.github.com/blog/2013/12/01/yardbirds-effective-usenet-tradecraft/"/>
<updated>2013-12-01T07:15:00+07:00</updated>
<id>http://grugq.github.com/blog/2013/12/01/yardbirds-effective-usenet-tradecraft</id>
<content type="html"><![CDATA[<h1>Survival in an Extremely Adversarial Environment</h1>
<blockquote><p>If your secure communications platform isn’t being used by terrorists and pedophiles, you’re probably doing it wrong. – [REDACTED]</p></blockquote>
<p>A few years ago a group of child pornographers was infiltrated by police who were
able to monitor, interact with, and aggressively investigate the members. Despite
engaging in a 15-month undercover operation, only one in three of the pedophiles
was successfully apprehended. The majority, including the now infamous leader
<em>Yardbird</em>, escaped capture. The dismal success rate of the law enforcement
officials was due entirely to the strict security rules followed by the group.</p>
<p>This post will examine those rules, the reasons for their success, and the
problems the group faced which necessitated those rules.</p>
<p>(An examination of the group’s security from a slightly different perspective
was conducted by <code>Baal</code> and is available <a href="http://dee.su/uploads/baal.html">here</a>)</p>
<h2>Covert Organizations, Seen One, Seen ’em All</h2>
<p>All covert organizations face a similar set of problems as they attempt to execute
on their fundamental mission – to <strong>continue to exist</strong>. A covert organization
in an adversarial environment faces a number of organizational challenges and
constraints. Fundamentally how it handles trade-offs between operational
security and efficiency mandates how group members perform their operational
activities. Strong OPSEC means low efficiency, while high efficiency necessitates
weak OPSEC. The strength of the oppositional forces dictates the minimum security
requirements of the covert organization.</p>
<p>Examining the operational activities – those actions the organization must engage
in to self perpetuate – allows us to evaluate their operational security decisions
within their environmental context.</p>
<h3>Operational Activities:</h3>
<p>The <em>Yardbird</em> child abuse content group (hereafter also called the <em>enterprise</em>)
had a number of core goals that had to be addressed to continue operation: they
needed to distribute their child abuse content to members; communicate between
members; raise funds to acquire new content; recruit new members (presumably for
access to additional child abuse content).</p>
<p>Explicitly stated, this is an enumerated list of the operational activities that
the group <strong>had</strong> to engage in to self perpetuate.</p>
<ol>
<li>Distribution of Child Abuse Content</li>
<li>Communication and Coordination of Action</li>
<li>Fund raising</li>
<li>Recruitment and Vetting</li>
</ol>
<p>Except for the first issue (strategically significant only to this group),
these are pretty typical activities for a clandestine organization. Besides their
defining operational activity, they need a communications channel, fund raising
capability, and membership management processes.</p>
<h2>Opposition Success: The Penetration</h2>
<p>The law enforcement authorities caught a pedophile distributing child abuse
content. He was a member of the <em>Yardbird</em> group and offered up complete access
to the group, along with archival logs, in exchange for leniency.</p>
<p>All of the information about this group comes from the <a href="https://drive.google.com/file/d/0Bzt0K7_O4qyqMi0ycjZFdzhxYTA/edit?usp=sharing">Castleman Affidavit</a>,
the <code>Baal</code> <a href="http://dee.su/uploads/baal.html">analysis</a>, and some <code>Baal</code> <a href="http://alt.privacy.anon-server.narkive.com/VRl3dTFH/the-fourth-of-yardbird-s-chums-grabs-his-ankles">follow ups</a>.</p>
<h2>A Frustrating Infiltration</h2>
<p>The law enforcement authorities were able to completely penetrate the <em>enterprise</em>
for a 15-month period from 2006-08-31 through 2007-12-15. During that time the
group posted 400,000 images and 11,000 videos. The <em>enterprise</em> had approximately
45 active members, although <a href="http://alt.privacy.anon-server.narkive.com/VRl3dTFH/the-fourth-of-yardbird-s-chums-grabs-his-ankles">independent observers</a> have claimed this is low, with
the real membership anywhere from 48 to 61.</p>
<p>The total number of arrests was 14, or somewhere around one third. A fully staffed,
highly motivated, well trained adversarial force with complete penetration of
a large complacent group was only able to achieve a one in three success rate.
The majority of those successes were achieved due to group members being insufficiently