MalPipe – Malware/IOC Ingestion And Processing Engine

MalPipe is a modular malware (and indicator) collection and processing framework. It is designed to pull malware, domains, URLs and IP addresses from multiple feeds, enrich the collected data and export the results.

At this time, the following feeds are supported:

- VirusTotal (https://www.virustotal.com)
- MalShare (https://malshare.com/)
- BambenekFeeds (osint.bambenekconsulting.com/feeds/)
- FeodoBlockList (https://feodotracker.abuse.ch)
- Malc0deIPList (http://malc0de.com/)
- NoThinkIPFeeds (www.nothink.org/)
- OpenPhishURLs (https://openphish.com)
- TorNodes (https://torstatus.blutmagie.de)

Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.

Installing

Deployment of MalPipe requires installing the required Python libraries and configuring the various modules. Python dependencies can be installed by running:

```
pip install -r requirements.txt
```

Configuring

Feeds

An example configuration is provided in config_example.json with settings to get started. This file contains a JSON object with the required settings for each feed / processor / exporter. A description of a feed's settings is shown below:

```
"feeds": {
    ...
    "MalShare": {
        "ENABLED": true,
        "API_KEY": "00000000000000000000000000000000000000000000000000000000000",
        "EXPORTERS": ["DetailsPrinter", "JSONLog"],
        "PROCESSORS": ["YaraScan", "DNSResolver"]
    },
    ...
```

As some feeds update daily, feeds can take two forms: scheduled and active. Settings for when these should run are defined outside of the configuration, in the individual modules.

Processors

Processors are used to enrich and standardize the collected data. For example, data from VirusTotal contains YARA results for each file collected, whereas MalShare does not.
By adding YaraScan to the PROCESSORS key, you can scan the files to also include this data. An example module's settings are below:

```
"processors": {
    ...
    "YaraScan": {
        "ENABLED": false,
        "RULES_PATH": "/yara_rules/Malware.yara"
    },
    ...
```

Currently, the following processors have been implemented:

- ASNLookup
- DNSResolver
- FileType
- RDNS
- YaraScan

Exporters

The final component is exporters; these control where the data goes. They can be used to export collected data to a malware repository, a SIEM, or JSON log files, or to print it for the user.

```
"exporters": {
    ...
    "JSONLog": {
        "ENABLED": true,
        "PRETTY": true,
        "LOG_PATH": "./temp/"
    },
    ...
```

Currently, the following exporters have been implemented:

- DetailsPrinter
- GenericWebStorage
- JSONLog
- LocalFileStorage

Running

After setup, MalPipe can be run by using:

```
python malpipe.py
```

Developing Modules

Modules for MalPipe are located under malpipe/ by type:

- Feeds
- Processors
- Exporters

Creating new modules is easy.

Create Python Module

MalPipe modules are defined as Python classes. The following is an example module header:

```python
class ModuleName(Processor):
    def __init__(self):
        md = ProcessorDescription(
            module_name="ModuleName",
            description="Description",
            authors=["Author Name"],
            version="VersionNumber"
        )
        Processor.__init__(self, md)
        self.types = ['ipaddresses']
        self.parse_settings()
```

Settings can be read by importing the configuration and assigning values to class variables, as shown below:

```python
from malpipe.config import CONFIG
...
self.yara_rule_path = CONFIG['processors'][self.get_module_name()]['RULES_PATH']
```

Each processor is required to have a run function that is called by the feed.

Add Settings

After creating the module, its settings need to be added to the config.json under the processors, feeds, or exporters key. If the new module is a processor or exporter, it will also need to be added to the associated feeds.
An example is shown below:

```
"processors": {
    ...
    "SuperNewModule": {
        "ENABLED": true,
        "DOCOOLSTUFF": true
    },
    ...
"feeds": {
    ...
    "0DayMalwareFeed": {
        "ENABLED": true,
        "EXPORTERS": ["DetailsPrinter", "JSONLog"],
        "PROCESSORS": ["SuperNewModule"]
    }
    ...
```

Contributing

Please report any problems by creating an issue or starting a pull request. If you have additional modules or features you would like to see, please consider opening an issue.

Authors

Silas Cutler – GitHub | Twitter

See also the list of contributors who participated in this project.

Download MalPipe
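To make the module walkthrough above concrete, here is a minimal self-contained sketch of the processor pattern. ProcessorDescription, Processor, CONFIG and the TagExample module are simplified stand-ins (assumptions, not MalPipe's actual classes, which live under malpipe/); they only illustrate the shape of a module with a run function:

```python
# Self-contained sketch of the MalPipe processor pattern described above.
# The base classes and CONFIG below are simplified stand-ins so the example
# runs on its own; the real implementations live under malpipe/.

class ProcessorDescription:
    def __init__(self, module_name, description, authors, version):
        self.module_name = module_name
        self.description = description
        self.authors = authors
        self.version = version

class Processor:
    def __init__(self, md):
        self.md = md
        self.types = []

    def get_module_name(self):
        return self.md.module_name

# Stand-in for malpipe.config.CONFIG
CONFIG = {"processors": {"TagExample": {"ENABLED": True, "TAG": "seen"}}}

class TagExample(Processor):
    def __init__(self):
        md = ProcessorDescription(
            module_name="TagExample",
            description="Tags every collected indicator",
            authors=["Example Author"],
            version="0.1",
        )
        Processor.__init__(self, md)
        self.types = ["ipaddresses"]
        # Settings are pulled from the config, keyed by module name
        self.tag = CONFIG["processors"][self.get_module_name()]["TAG"]

    # Every processor must expose run(); the feed calls it with data.
    def run(self, item):
        item["tag"] = self.tag
        return item

p = TagExample()
print(p.run({"ip": "203.0.113.5"}))  # {'ip': '203.0.113.5', 'tag': 'seen'}
```

The real base class additionally provides parse_settings(), which reads the module's section of the configuration automatically.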

Link: http://feedproxy.google.com/~r/PentestTools/~3/Zo3edExBymM/malpipe-malwareioc-ingestion-and.html

RTA (Red Team Arsenal) – An Intelligent Scanner To Detect Security Vulnerabilities In Companies Layer 7 Assets

Red Team Arsenal is a web/network security scanner with the capability to scan all of a company's online-facing assets and provide a holistic view of any security anomalies. It is a closely linked collection of security engines that conduct/simulate attacks and monitor public-facing assets for anomalies and leaks.

It is an intelligent scanner that detects security anomalies in all layer 7 assets and gives a detailed report, with integration support for Nessus. As companies continue to expand their footprint on the Internet via acquisitions and geographical expansions, human-driven security engineering does not scale; companies need feedback-driven automated systems to keep up.

Installation

Supported Platforms

RTA has been tested both on Ubuntu/Debian (apt-get based distros) as well as Mac OS. It should ideally work with any Linux-based distribution with MongoDB and Python installed (install the required Python libraries from install/py_dependencies manually).

Prerequisites

There are a few packages which are necessary before proceeding with the installation:

- Git client: sudo apt-get install git
- Python 2.7, which is installed by default on most systems
- Python pip: sudo apt-get install python-pip
- MongoDB: read the official installation guide to install it on your machine.

Finally, run python install/install.py

There are also optional packages/tools you can install (highly recommended):

Integrating Nessus

Integrating Nessus into Red Team Arsenal can be done in 3 simple steps:

1. Download and install Nessus community edition (if you don't have a paid edition). If you already have an installation (it can be a remote installation as well), go to step 2.
2. Update the config file (present in the root directory of RTA) with the Nessus URL, username and password.
3. Create a Nessus policy where you can configure the type of scans and plugins to run, and name it RTA (case sensitive – use full uppercase).
Once the config file has the correct Nessus information (URL, username, password), use the --nessus flag while running RTA to launch a Nessus scan over all the subdomains gathered by RTA (one single scan initiated with all the subdomains gathered).

Usage

```
Short Form    Long Form     Description
-u            --url         Domain URL to scan
-v            --verbose     Enable verbose mode and display results in realtime
-n            --nessus      Launch a Nessus scan with all the subdomains
-s            --scraper     Run scraper based on config keywords
-h            --help        Show the help message and exit
```

Sample Output

```
a0xnirudh@exploitbox /RTA (master*) $ python rta.py --url "0daylabs.com" -v -s

[... Red Team Arsenal ASCII art banner ...]

[i] Checking for Zonetransfer
[i] Zone Transfer is not enabled
[i] Checking for SPF records
[+] SPF record lookups is good. Current value is: 9
[-] Enumerating subdomains now for 0daylabs.com
[-] Searching now in Baidu..
[-] Searching now in Yahoo..
[-] Searching now in Google..
[-] Searching now in Bing..
[-] Searching now in Ask..
[-] Searching now in Netcraft..
[-] Searching now in DNSdumpster..
[-] Searching now in Virustotal..
[-] Searching now in ThreatCrowd..
[-] Searching now in SSL Certificates..
[-] Searching now in PassiveDNS..
[-] Total Unique Subdomains Found: 3
blog.0daylabs.com
www.0daylabs.com
test.0daylabs.com
[+] Verifying Subdomains and takeover options
[+] Possible subdomain takeovers (Manual verification required):
test.0daylabs.com
[i] Verified and Analyzed Subdomains:
[i] URL: blog.0daylabs.com
[i] Wappalyzer: [u'jQuery', u'Varnish', u'Font Awesome', u'Twitter Bootstrap', u'Google Analytics', u'Google Font API', u'Disqus', u'Google AdSense']
[i] Scraper Results
[+] Shodan
Hostname: test.0daylabs.com IP: 139.59.63.111 Ports: 179
Hostname: test.0daylabs.com IP: 139.59.63.111 Ports: 179
[+] Twitter
URL: https://twitter.com/tweetrpersonal9/status/832624003751694340 search string: 0daylabs
URL: https://twitter.com/ratokeshi/status/823957535564644355 search string: 0daylabs
```

Notifications

Configuring Slack

RTA can also push notifications to Slack, including the main scan highlights along with Nessus and other integrated scanner reports, divided on the basis of severity.

1. In your Slack, create an incoming webhook and point it to the channel where you need RTA to send the report. You can read more about creating incoming webhooks in the Slack documentation.
2. In the config file, update the URL in the slack section with the full URL (including https://) for the incoming webhook.

Once Slack is configured, you will automatically start getting reports on your configured Slack channel.

Roadmap

Here are a couple of ideas which we have in mind going ahead with RTA.
If you have any ideas/feature requests not listed below, feel free to raise an issue on GitHub.

- Email the results once the scan is completed.
- Extend the current RTA API so that custom scans with the required options can be launched via the API.
- Launch custom scans based on Wappalyzer results (e.g. wpscan if WordPress is detected).
- Investigate and integrate more web security scanners, including but not limited to Arachni, Wapiti, Skipfish and others!
- JSON/XML output formatting for the RTA scan result.
- Improving the logic for subdomain takeover.
- Multi-threading support for faster scan completion.

Contributors

Awesome people who built this project:

Lead Developers:

- Anirudh Anand (@a0xnirudh)

Project Contributors:

- Mohan KK (@MohanKallepalli)
- Ankur Bhargava (@_AnkurB)
- Prajal Kulkarni (@prajalkulkarni)
- Himanshu Kumar Das (@mehimansu)

Special Thanks

- Sublist3r

Download RTA
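The Slack integration described in the Notifications section boils down to a single JSON POST to the incoming webhook URL. Below is a minimal sketch of that flow; the webhook URL, the finding format and the severity grouping are illustrative assumptions, not RTA's actual code:

```python
import json
from urllib import request

# Hypothetical webhook URL -- replace with the one from your Slack config.
WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXXXXXX"

def build_payload(findings):
    """Group findings by severity into one Slack message payload."""
    by_sev = {}
    for f in findings:
        by_sev.setdefault(f["severity"], []).append(f["title"])
    lines = ["*%s*: %s" % (sev.upper(), ", ".join(titles))
             for sev, titles in sorted(by_sev.items())]
    return {"text": "RTA scan report\n" + "\n".join(lines)}

def notify(findings):
    """POST the report to the configured incoming webhook."""
    data = json.dumps(build_payload(findings)).encode()
    req = request.Request(WEBHOOK_URL, data=data,
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req)

payload = build_payload([
    {"severity": "high", "title": "Possible subdomain takeover: test.0daylabs.com"},
    {"severity": "info", "title": "SPF record present"},
])
print(payload["text"])
```

Slack renders the `text` field with the `*bold*` severity headers; a real integration would also split Nessus results into separate attachments per severity.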

Link: http://feedproxy.google.com/~r/PentestTools/~3/MXF7YfYc5U8/rta-red-team-arsenal-intelligent.html

Git-All-Secrets – A Tool To Capture All The Git Secrets By Leveraging Multiple Open Source Git Searching Tools

git-all-secrets is a tool that can:

- Clone multiple public/private GitHub repositories of an organization and scan them,
- Clone multiple public/private GitHub repositories of a user that belongs to an organization and scan them,
- Clone a single public/private repository of an organization and scan it,
- Clone a single public/private repository of a user and scan it,
- Clone a single public/secret gist of a user and scan it,
- Clone a team's repositories in an organization and scan them,
- All of the above together!! Oh yeah!! Simply provide an organization name and get all their secrets. If you also want to get secrets of a team within an organization, just mention the team name along with the org.
- Clone and scan GitHub Enterprise repositories and gists as well.

Scanning is done by multiple open source tools:

- truffleHog – scans commits for high entropy strings and user-provided regular expressions,
- repo-supervisor – scans for high entropy strings in .js and .json files.

NOTE – More such tools can be added in the future, if desired!
NOTE – Scanning can be done by all the tools or any one of them by specifying the toolName flag.

If all the tools are used to scan, the final output combines the output of all files from all the tools into one consolidated output file.

Getting started

The easiest way to run git-all-secrets is via Docker, and I highly recommend installing Docker if you don't already have it. Once you have Docker installed:

- Type docker run --rm -it abhartiya/tools_gitallsecrets --help to understand the different flags it can take as input.
- Once you know what you want to scan, type something like docker run -it abhartiya/tools_gitallsecrets -token=<> -org=<>. You can also specify a particular tool to use for scanning by typing something like docker run -it abhartiya/tools_gitallsecrets -token=<> -org=<> -toolName=<>.
Options are thog and repo-supervisor. If you want to run truffleHog with the default regexes AND the high entropy settings, provide the thogEntropy flag like this: docker run -it abhartiya/tools_gitallsecrets -token=<> -org=<> -toolName=thog -thogEntropy.

- After the container finishes running, retrieve the container ID by typing docker ps -a.
- Once you have the container ID, get the results file from the container to the host by typing docker cp <container-id>:/data/results.txt .

Flags/Options

-token = GitHub personal access token. We need this because unauthenticated requests to the GitHub API hit rate limiting pretty soon!

-org = Name of the organization to scan. This will scan all public repos in the org + all the repos & gists of all users in the org. If you are using a token of a user who is part of this org, it will also clone and scan all the secret gists belonging to that user, as well as all the private repos in the org that the user has access to. However, it will NOT clone and scan any private repositories of this user belonging to this org. To scan private repositories of users, please use the scanPrivateReposOnly flag with the user flag, along with the SSH key mounted on a volume.

-user = Name of the user to scan. This will scan all the repos & gists of this user. If the token provided is the token of the user, secret gists will also be cloned and scanned, but only public repos will be cloned and scanned. To scan private repositories of this user, please use the scanPrivateReposOnly flag with the user flag, along with the SSH key mounted on a volume.

-repoURL = HTTPS URL of the repo to scan. This will scan this repository only. For public repos, mentioning the HTTPS URL of the repo will suffice. However, if you wish to scan a private repo, you need to provide the SSH URL, along with the SSH key mounted on a volume and the scanPrivateReposOnly flag.

-gistURL = HTTPS URL of the gist to scan. This will scan this gist only.
There is no concept of a public or secret gist as long as you have the URL: even if you have a secret gist, anyone who knows its HTTPS URL can access it too.

-output = The name of the file where all the results get stored. By default, this is results.txt.

-cloneForks = Optional boolean flag to clone forks of org and user repositories. By default, this is set to 0, i.e. no cloning of forks. If forks are to be cloned, set this value to 1, or simply mention -cloneForks along with the other flags.

-orgOnly = Optional boolean flag to skip cloning user repositories belonging to an org. By default, this is set to 0, i.e. regular behavior. If user repos are not to be scanned and only the org repositories are, set this value to 1, or simply mention -orgOnly along with the other flags.

-toolName = Optional string flag to specify which tool to use for scanning. By default, this is set to all, i.e. both thog and repo-supervisor will be used for scanning. Values are either thog or repo-supervisor.

-teamName = Name of the organization team which has access to private repositories for scanning. This flag is not fully tested, so I can't guarantee the functionality.

-scanPrivateReposOnly = Optional boolean flag to specify whether to scan private user repositories. Mentioning this will NOT scan public user repositories, and you need to provide the SSH key by mounting the volume onto the container. Also, this only works with either the user flag, the repoURL flag or the org flag.

When the org flag is mentioned along with the scanPrivateReposOnly flag and without the orgOnly flag, it will scan the public AND the private repos belonging to this org that the user (whose token is provided) has access to. It will then continue to scan ONLY the private repositories of that user.
Finally, it will continue to scan all public and secret gists of this user (whose token is provided). In a nutshell, the scanPrivateReposOnly flag only really affects the user and the repoURL flags.

-enterpriseURL = Optional flag to provide the enterprise GitHub URL, if you wish to scan enterprise repositories. It should be something like https://github.org.com/api/v3, along with the SSH key mounted onto the container. Refer to scanning GitHub Enterprise below.

-threads = Default value is 10. This is to limit the number of threads if your system is not beefy enough. For the most part, leaving this at 10 should be okay.

-thogEntropy = Optional flag that tells whether you want high-entropy-based secrets back from truffleHog or not. The high entropy secrets from truffleHog produce a LOT of noise, so if you don't want all that noise and you are running git-all-secrets on a big organization, I'd recommend not mentioning this flag. By default, this is set to False, which means truffleHog will only produce results based on the regular expressions in the regexChecks.py file. If you are scanning a fairly small org with a limited set of repos, or a user with a few repos, mentioning this flag makes more sense.

Note

The token flag is compulsory. It can't be empty.

The org, user, repoURL and gistURL flags can't all be empty at the same time; you need to provide just one of these values. If you provide all of them or multiple values together, the order of precedence is org > user > repoURL > gistURL. For instance, if you provide both -org=secretorg123 and -user=secretuser1 together, the tool will complain that it doesn't need anything along with the org value. To run it against a particular user only, just provide the user flag and not the org flag.
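The precedence rule just described can be sketched in a few lines. This is an illustrative Python model of the documented behaviour, not git-all-secrets' actual Go code; the helper name and the warning wording are assumptions:

```python
# Sketch of the documented org > user > repoURL > gistURL precedence.
PRECEDENCE = ["org", "user", "repoURL", "gistURL"]

def pick_target(flags):
    """Return (flag, value) for the highest-precedence non-empty flag."""
    given = [f for f in PRECEDENCE if flags.get(f)]
    if not given:
        raise ValueError("one of -org, -user, -repoURL or -gistURL is required")
    if len(given) > 1:
        # The real tool complains about the extra flags; here we just warn.
        print("warning: ignoring", ", ".join(given[1:]))
    return given[0], flags[given[0]]

print(pick_target({"org": "secretorg123", "user": "secretuser1"}))
# ('org', 'secretorg123')
```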
When specifying the scanPrivateReposOnly flag:

- One must mount a volume containing the private SSH key onto the Docker container using the -v flag. It should be used any time a private repository is scanned.
- Please use the SSH URL when using the flag, not the HTTPS URL.
- Please make sure the token being used actually belongs to the user whose private repository/gist you are trying to scan; otherwise there will be errors.
- The SSH key that you will be using should NOT have a passphrase set if you want this tool to work without any manual intervention.
- Refer to scanning private repositories below.

When specifying teamName, it is important that the provided token belong to a user who is a member of the team. Unexpected results may occur otherwise. Refer to scanning an organization team below.

When specifying the enterpriseURL flag, the tool will always use the SSH URL, even if you provide the HTTPS URL of a repository. All the enterprise cloning/scanning happens via the SSH URL and not the HTTPS URL. As mentioned above, make sure the SSH key being used (to scan the SSH URL) does not have any passphrase set.

Scanning Private Repositories

The most secure way to scan private repositories is to clone using the SSH URLs. To accomplish this, one needs an appropriate SSH key which has been added to a GitHub user. GitHub has helpful documentation for configuring your account. Make sure this key does not have any passphrase set on it. Once you have the SSH key, simply mount it into the Docker container via a volume.
It is as simple as typing one of the commands below:

```
docker run -it -v ~/.ssh/id_rsa_personal:/root/.ssh/id_rsa abhartiya/tools_gitallsecrets -token=<> -user=<> -scanPrivateReposOnly
```

OR

```
docker run -it -v ~/.ssh/id_rsa_personal:/root/.ssh/id_rsa abhartiya/tools_gitallsecrets -token=<> -repoURL=<> -scanPrivateReposOnly
```

Here, I am mapping my personal SSH key id_rsa_personal, stored locally, to /root/.ssh/id_rsa inside the container, so that git-all-secrets will try to clone the repo via SSH and will use the SSH key stored at /root/.ssh/id_rsa inside the container. This way, you are not really storing anything sensitive inside the container; you are just using a file from your local machine. Once the container is destroyed, it no longer has access to this key.

Scanning an Organization Team

The GitHub API limits the circumstances under which a private repository is reported. If you are trying to scan an organization with a user who is not an admin, you may need to provide the team which grants repository access to that user. In order to do this, use the teamName flag along with the org flag. An example is below:

```
docker run -it -v ~/.ssh/id_rsa_personal:/root/.ssh/id_rsa abhartiya/tools_gitallsecrets -token=<> -org=<> -teamName <>
```

Scanning Github Enterprise

git-all-secrets now supports scanning GitHub Enterprise as well. If you have your own GitHub Enterprise hosted behind a VPN or something, make sure you are connected to the VPN or to the correct network that has access to the GitHub Enterprise repos. The enterpriseURL flag is what you'd need to scan your GitHub Enterprise repos.
Below are some examples.

Example 1:

```
docker run -it -v ~/.ssh/id_rsa_gitenterprise:/root/.ssh/id_rsa abhartiya/tools_gitallsecrets -token <token> -enterpriseURL https://github.<org>.com/api/v3 -repoURL https://github.<org>.com/<user>/<repo>.git
```

Here, I am mounting my GitHub Enterprise SSH key onto the container, followed by my personal access token, the enterprise URL to which the requests will be sent, and the repo I want to scan.

Example 2:

```
docker run -it -v ~/.ssh/id_rsa_gitenterprise:/root/.ssh/id_rsa abhartiya/tools_gitallsecrets -token <token> -enterpriseURL https://github.<org>.com/api/v3 -repoURL https://github.<org>.com/<user>/<repo>.git -toolName thog -thogEntropy
```

Above, I am just running truffleHog against the repository, with the entropy settings.

Example 3:

```
docker run -it -v ~/.ssh/id_rsa_gitenterprise:/root/.ssh/id_rsa abhartiya/tools_gitallsecrets -token <token> -enterpriseURL https://github.<org>.com/api/v3 -user <username> -scanPrivateReposOnly
```

Above, I am scanning only the private repositories of the user whose token is provided, with all the tools (repo-supervisor and thog) but without the entropy setting of truffleHog.

Details

Features

- You can add your own regular expressions in the regexChecks.py file and include them when executing docker run, using the argument -v $(pwd)/regexChecks.py:/data/truffleHog/truffleHog/regexChecks.py.
- The tool looks for some default regular expressions. If needed, it can also look for high entropy strings. All this happens via the truffleHog tool.
- It can look for high entropy strings in .js and .json files via the repo-supervisor tool.
- It scans users' gists, which most of the tools don't.
- If there is a new tool that is good, it can be integrated into git-all-secrets pretty effortlessly.
- It is built for integration with other tools and frameworks. It takes in a few input parameters and produces an output file of the results. Pretty straightforward!
- It supports scanning GitHub Enterprise orgs/users/repos/gists as well.
- Most of the tools out there are made to scan individual repositories.
If you want to loop over multiple repositories, you'd have to write your own for loop in a shell script or something like that; git-all-secrets can help you scan multiple repositories in one go.

Motivation

I looked at a large number of open source tools that could potentially be used to look for secrets in GitHub repositories. Some of the top tools that I thought were good are gitrob, truffleHog and git-secrets.

Gitrob is meant to be a standalone tool that is pretty difficult to integrate with other tools, because it has its own database and UI to view all the secrets discovered. It also produces a ton of false positives, more than truffleHog, and it doesn't really highlight the secrets discovered: it just looks at the files and their extensions, not the actual content. So, although Gitrob is a great tool to get started with, I would recommend running it only every once in a while to understand what the attack surface looks like and see if it has changed.

Then there is truffleHog, which looks for secrets in the actual contents of files by looking at Shannon entropy, and prints the output on the screen. It takes a repository URL or a repository directory as an argument. This is a pretty good tool, although it does have its share of false positives. Some of the other drawbacks are:

- We can't use it recursively to scan directories that contain multiple repositories.
- There is no way to use truffleHog to identify secrets that follow a certain pattern but don't have high enough entropy, i.e. we can't make it look for secrets that we know of but that don't necessarily have high enough entropy to be considered a secret.
- It prints the output on the screen, so it is not really useful for automation as such.

Finally, there is git-secrets, which can flag things like AWS secrets. The best part is that you can add your own regular expressions as well, for secrets that you know it should be looking for.
A major drawback is that it doesn't do as good a job of finding high entropy strings as truffleHog does. You can also only scan a particular directory that is a repository, so no recursive scanning from a directory of repositories either.

So, as you can see, there are decent tools out there, but they had to be combined somehow. There was also a need to recursively scan multiple repositories and not just one. And what about gists? There are organizations and users; there are repositories for organizations and users; there are also gists by users. All of these should be scanned, and scanned such that the process can be automated and easily consumed by other tools/frameworks.

Changelog

12/12/17 – For some large repos, truffleHog fails and exits, but we don't want to stop there: we want to notify the user that scanning failed for that repo and continue scanning the other repos. This is now implemented in the latest Docker image.

12/11/17 – Removed gitsecrets because truffleHog supports regex functionality now. Simply adding your regexes in the regexChecks.py file and rebuilding the Docker image gives us the functionality that gitsecrets was providing, so there is no need for gitsecrets anymore. I also added support for scanning GitHub Enterprise repos & gists. @high-stakes helped get in a PR that (hopefully) fixes the Goroutine bug by limiting the number of threads. Finally, support for scanning private repositories for an organization was added as well.

12/08/17 – Removed my own fork of truffleHog. Using the upstream version now, along with the new regex functionality of truffleHog + entropy mode. Soon, I believe we can replace both gitsecrets and repo-supervisor with just truffleHog, once some issues are fixed.

12/07/17 – Updated the documentation with some more details and explanation around the different flags.

12/05/17 – Integrated scanning support for private repositories via SSH key.
This had been an ask for the longest time, and it is now possible. Also, changed the Docker image tag scheme: from now on, the latest image will have the latest tag, and all previous versions will be tagged with a number. All this couldn't have been possible without the SimpliSafe team, especially Matthew Cox (https://github.com/matthew-cox). So, a big shoutout to you, Matt!

10/14/17 – Built and pushed the new image abhartiya/tools_gitallsecrets:v6. This new image has the newer versions of git-secrets and repo-supervisor, i.e. I merged some upstream changes into my fork along with some additional changes I had already made. The new image uses these changes, so everything is the latest and greatest!

10/14/17 – Built and pushed the new image abhartiya/tools_gitallsecrets:v5. This image fixes a very stupid and irritating bug which was possibly causing repo-supervisor to fail. Something changed in the way environment values are read in the Dockerfile, which resulted in repo-supervisor not understanding which Node path to use. Node hell!

9/29/17 – Built and pushed the new image with the orgOnly flag – abhartiya/tools_gitallsecrets:v4

8/22/17 – Added the -orgOnly toggle by kciredor: analyzes the specified organization's repos and skips user repos.

6/26/17 – Removed some output in repo-supervisor that printed errors when no secrets were found. Unnecessary output! Built and pushed the new image – abhartiya/tools_gitallsecrets:v3

6/25/17 – Added the toolName flag to specify which tool to use for scanning. Built and pushed the new image – abhartiya/tools_gitallsecrets:v2

6/14/17 – Added repo-supervisor as a scanning tool; also updated and added the version number to the Docker image – abhartiya/tools_gitallsecrets:v1

6/14/17 – Added the cloneForks flag to avoid cloning forks of org and user repos. By default, this is false. If you wish to scan forks, just set the value to 1, i.e. -cloneForks=1

Download Git-All-Secrets

Link: http://feedproxy.google.com/~r/PentestTools/~3/H-6r96VjPQQ/git-all-secrets-tool-to-capture-all-git.html

M0B-tool – Auto Detect CMS And Exploit

Tool to auto detect CMS and exploit.

Features:

- Bing dork scanner by domain
- Dork by country
- Brute force [WordPress (auto scrape name) – Joomla – Drupal – Opencart – Magento]
- Shell finder
- IP scanner and brute force
- Auto detect CMS and exploit

Run

```
perl MENU.pl
```

Install

```
git clone https://github.com/mobrine-mob/M0B-tool.git
```

Script created by M0B.

BING DORK SCANNER

When the scan ends, you will find all the URLs in /result.

DORK BY COUNTRY

You put in a dork and you get a list of dorks like this:

```
example?lotid=+site:ac
example?lotid=+site:ad
example?lotid=+site:ae
example?lotid=+site:af
example?lotid=+site:ag
example?lotid=+site:ai
example?lotid=+site:al
example?lotid=+site:am
example?lotid=+site:an
example?lotid=+site:ao
example?lotid=+site:aq
example?lotid=+site:ar
example?lotid=+site:as
example?lotid=+site:at
example?lotid=+site:au
example?lotid=+site:aw
example?lotid=+site:ax
example?lotid=+site:az
example?lotid=+site:ba
example?lotid=+site:bb
example?lotid=+site:bd
```

WordPress – Joomla – Drupal – Opencart – Magento BRUTE FORCE

If you want to brute force with your own password list, change the list name to passwords.txt. Note that you are going to find the good ones in Result.txt in the main M0B folder.

IP SCANNER & BRUTE FORCE

IP scanner and brute force for SSH and FTP. It needs more work; you may find some bugs.

AUTO DETECT CMS AND EXPLOIT

WARNING: If you want to exploit Drupal (add admin), you need to upload drupal.php to your localhost, or upload it to your shell and edit it in the script.

Download M0B-tool
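The "dork by country" feature above is just a base dork expanded with a +site: filter per country-code TLD. A minimal sketch of that expansion (the truncated TLD list and the helper name are illustrative, not the tool's actual Perl code):

```python
# Sketch of the "dork by country" expansion: one base dork is suffixed
# with +site:<tld> for every country-code TLD. List truncated for brevity.
CC_TLDS = ["ac", "ad", "ae", "af", "ag", "ai", "al", "am"]

def dorks_by_country(base_dork, tlds=CC_TLDS):
    return ["%s+site:%s" % (base_dork, tld) for tld in tlds]

for d in dorks_by_country("example?lotid="):
    print(d)
# example?lotid=+site:ac
# example?lotid=+site:ad
# ...
```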

Link: http://feedproxy.google.com/~r/PentestTools/~3/hHmVFacJIao/m0b-tool-auto-detect-cms-and-exploit.html

Envizon – Network Visualization Tool With Focus On Red / Blue Team Requirements

This tool is designed, developed and supported by evait security. In order to give something back to the security community, we publish our internally used and developed, state-of-the-art network visualization and organization tool, 'envizon'. We hope your feedback will help to improve and hone it even further.

Core Features:

- Scan networks with predefined or custom nmap queries
- Order clients with preconfigured or custom groups
- Search through all attributes of clients and create complex linked queries
- Get an overview of your targets during pentests with predefined security labels
- Save and reuse your most-used nmap scans
- Collaborate with your team on the project in realtime
- Export selected clients to a text file to quickly connect other tools

How to start?!

To avoid compatibility and dependency issues, and to make it easy to set up, we use Docker. You can build your own images or use prebuilt ones from Docker Hub.

Using Docker

Docker and Docker Compose are required.

```
git clone https://github.com/evait-security/envizon
cd envizon
# Create self-signed certificates:
mkdir .ssl
openssl req -x509 -sha256 -nodes -newkey rsa:2048 -days 365 -keyout .ssl/localhost.key -out .ssl/localhost.crt
# If you want to use certificates located elsewhere, provide their paths with SSL_CERT_PATH and SSL_KEY_PATH.
# Create a secret; if you have rails installed locally you can just use:
rails secret
# otherwise, use openssl:
openssl rand -hex 64
# This needs to be provided either as an environment variable (SECRET_KEY_BASE) or added to the docker-compose.yml.
sudo docker-compose up
```

Development

If, for whatever reason, you want to run the development environment in production, you should probably consider changing the secrets in config/secrets.yml, and maybe even manually activate SSL.

```
git clone https://github.com/evait-security/envizon
cd envizon
sudo docker-compose -f docker-compose-development.yml up
```

Running tests:

```
docker exec -it envizon_container_name_1 /bin/ash -c 'rails test'
```

Without Docker

Requires a PostgreSQL server.

Create a database envizon with a user envizon. Password and socket location can be modified in the docker-compose.yml. Your user needs SUPERUSER privileges; otherwise database import (and tests) won't work because of foreign key constraints: use ALTER USER envizon WITH SUPERUSER;.

```
git clone https://github.com/evait-security/envizon
cd envizon
bundle install --path vendor/bundle
```

You need to create a secret and SSL certificates, as described above. Then, run it with:

```
RAILS_ENV=production
export RAILS_ENV
SECRET_KEY_BASE=YOUR_SECRET
export SECRET_KEY_BASE
bundle exec rails db:setup
bundle exec rails db:migrate
bundle exec rails db:seed
bundle exec rails assets:precompile
RAILS_FORCE_SSL=true RAILS_SERVE_STATIC_FILES=true bundle exec rails s
```

Development

Databases for development and testing are called envizon_test and envizon_development, with the same requirements as above. Different database names and credentials can be provided via environment variables or modified directly in config/database.yml.

```
git clone https://github.com/evait-security/envizon
cd envizon
bundle install --path vendor/bundle
RAILS_ENV=development
export RAILS_ENV
bundle exec rails db:setup
bundle exec rails db:migrate
bundle exec rails db:seed
bundle exec rails s
```

To run the tests:

```
RAILS_ENV=test bundle exec rails db:setup
bundle exec rails test
```

Start with prebuilt images and postgresql docker image

Coming Soon™

Set a password

After starting the Docker images, go to https://localhost:3000/ (or http://localhost:3000 if not using SSL). You have to specify a password for your envizon instance. You can change it in the settings interface after logging in.

Scan interface

The scan interface is divided into two sections. On the left side you can run a new scan with preconfigured parameters or your own nmap fu. You also have the possibility to upload previously created nmap scans (created with the -oX parameter). On the right side you will see your running and finished scans.

Groups

The group interface is the heart of envizon.
You can select, group, order, quick search, global search, move, copy, delete and view your clients. The left side represents the group list. If you click on a group you will get a detailed view in the center of the page with the group content. Each client in a group has a link. By clicking on the IP address you will get a more detailed view on the right side with all attributes, labels, ports and nmap output.Most of the buttons and links have tooltips.Global SearchIn this section you can search for nearly anything in the database and combine each search parameter with ‘AND’, ‘OR’ & ‘NOT’.Perform simple queries for hostname, IP, open ports, etc. or create combined queries like: hostname contains ‘win’ AND mac address starts with ‘0E:5C’ OR has port 21 and 22 open.FAQAPI ?!Currently not. We will work on it. Maybe.Which browsers are supported?Latest Chrome / Chromium / Inox & Firefox / Waterfox.Why rails?!Wanted to learn ruby. It’s cool.Why so salty on github issue discussion?This is a community project. We are a full time pentesting company and will not go into / care about every open issue that doesn’t match our template or guidelines. If you get a rough answer or picture, you probably deserved it.What frameworks and tools were used?Ruby on Railsruby-nmap (https://github.com/sophsec/ruby-nmap)Materialize CSS (http://materializecss.com/)Fontawsome Icons (https://fontawesome.com/)Material Icons (https://material.io/icons/)Many, many helpful gemsHelp?You can get some information about the structure and usage on the official wiki.https://github.com/evait-security/envizon/wikiDownload Envizon
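Envizon's Global Search combines predicates with 'AND', 'OR' and 'NOT'. As a rough illustration of how such a combined query evaluates, here is a Python sketch over a hypothetical list of client records; the field names and data model are illustrative assumptions, not envizon's actual schema:

```python
# Illustrative only: evaluates the example query from the Global Search
# section against hypothetical client records (not envizon's real model).
clients = [
    {"hostname": "win-dc01", "mac": "0E:5C:11:22:33:44", "ports": [53, 88, 445]},
    {"hostname": "ubuntu-web", "mac": "AA:BB:CC:DD:EE:FF", "ports": [21, 22, 80]},
    {"hostname": "printer", "mac": "0E:5C:99:88:77:66", "ports": [9100]},
]

def matches(c):
    # hostname contains 'win' AND mac starts with '0E:5C'
    # OR has port 21 and 22 open
    return ("win" in c["hostname"] and c["mac"].startswith("0E:5C")) \
        or {21, 22} <= set(c["ports"])

hits = [c["hostname"] for c in clients if matches(c)]
print(hits)  # ['win-dc01', 'ubuntu-web']
```

The query mirrors the example given in the Global Search section: the AND clause catches the Windows domain controller, the port clause catches the FTP/SSH host, and the printer matches neither.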

Link: http://feedproxy.google.com/~r/PentestTools/~3/U_4aFfRhUhY/envizon-network-visualization-tool-with.html

Retire.Js – Scanner Detecting The Use Of JavaScript Libraries With Known Vulnerabilities

What you require you must also retire

There is a plethora of JavaScript libraries for use on the Web and in Node.JS apps out there. This greatly simplifies development, but we need to stay up-to-date on security fixes. "Using Components with Known Vulnerabilities" is now a part of the OWASP Top 10 list of security risks, and insecure libraries can pose a huge risk to your Web app. The goal of Retire.js is to help you detect the use of JS-library versions with known vulnerabilities.

Retire.js can be used in many ways:
- As a command line scanner
- As a grunt plugin
- As a gulp task
- As a Chrome extension
- As a Firefox extension
- As a Burp and OWASP ZAP plugin

Command line scanner
Scan a web app or node app for use of vulnerable JavaScript libraries and/or Node.JS modules. In the source code folder of the application run:

    $ npm install -g retire
    $ retire

Grunt plugin
A Grunt task for running Retire.js as part of your application's build routine, or some other automated workflow.

Gulp task
An example of a Gulp task which can be used in your gulpfile to watch and scan your project files automatically. You can modify the watch patterns and (optional) Retire.js options as you like.

    var gulp = require('gulp');
    var spawn = require('child_process').spawn;
    var gutil = require('gulp-util');

    gulp.task('retire:watch', ['retire'], function (done) {
      // Watch all javascript files and package.json
      gulp.watch(['js/**/*.js', 'package.json'], ['retire']);
    });

    gulp.task('retire', function() {
      // Spawn Retire.js as a child process
      // You can optionally add option parameters to the second argument (array)
      var child = spawn('retire', [], {cwd: process.cwd()});
      child.stdout.setEncoding('utf8');
      child.stdout.on('data', function (data) {
        gutil.log(data);
      });
      child.stderr.setEncoding('utf8');
      child.stderr.on('data', function (data) {
        gutil.log(gutil.colors.red(data));
        gutil.beep();
      });
    });

Chrome and Firefox extensions
Scans visited sites for references to insecure libraries, and puts warnings in the developer console. An icon in the address bar will also indicate whether vulnerable libraries were loaded.

Burp and OWASP ZAP plugin
@h3xstream has adapted Retire.js as a plugin for the penetration testing tools Burp and OWASP ZAP. An alternative OWASP ZAP plugin exists at https://github.com/nikmmy/retire/

Download Retire.Js
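Beyond the Gulp task above, CI pipelines often consume the scanner's machine-readable output (recent Retire.js versions support a JSON output format). The Python sketch below post-processes such a report; the embedded sample's shape is illustrative only and may differ between Retire.js versions:

```python
import json

# Illustrative sample of a Retire.js JSON report (exact shape varies by
# Retire.js version; this is an assumption, not the canonical schema).
report = json.loads("""
{"data": [
  {"file": "js/jquery.js",
   "results": [
     {"component": "jquery", "version": "1.8.1",
      "vulnerabilities": [
        {"severity": "medium",
         "identifiers": {"summary": "Selector interpreted as HTML"}}]}]}
]}
""")

# Collect every vulnerable component found in the scan.
findings = []
for entry in report.get("data", []):
    for result in entry.get("results", []):
        for vuln in result.get("vulnerabilities", []):
            findings.append((result["component"], result["version"],
                             vuln.get("severity", "unknown")))

for component, version, severity in findings:
    print(f"{component} {version}: {severity}")
```

A non-empty findings list is the natural signal for failing a build step.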

Link: http://feedproxy.google.com/~r/PentestTools/~3/cco1fXO00sQ/retirejs-scanner-detecting-use-of.html

CLOUDKiLL3R – Bypasses Cloudflare Protection Service Via TOR Browser

CLOUDKiLL3R bypasses the Cloudflare protection service via TOR Browser!

CLOUDKiLL3R Requirements:
- TOR Browser, to scan as many sites as you want :)
- Python Compiler

CLOUDKiLL3R Installation?
- Make sure that TOR Browser is up and running while working with CLOUDKiLL3R.
- Make sure that the IP and PORT are the same as in TOR Browser preferences > advanced > Networks.
- Include the files below in one folder:
  - FILTER.txt
  - CK.pl
- Make sure the modules below are installed. If not, install each one with: pip install [module name]
  - argparse
  - socks
  - socket
  - requests
  - sys

Contact:
Twitter.com/moh_security

Download CLOUDKiLL3R
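CLOUDKiLL3R's source is not reproduced here, but the underlying technique, routing requests through TOR Browser's local SOCKS proxy, can be sketched in Python. The port (9150 for TOR Browser; 9050 for a standalone tor daemon) and the check URL are common defaults assumed for illustration, not taken from the tool:

```python
# Hedged sketch of the underlying idea: send HTTP requests through TOR
# Browser's local SOCKS proxy so the target sees a TOR exit IP.

def tor_proxies(host="127.0.0.1", port=9150):
    # "socks5h" (rather than "socks5") makes the proxy resolve DNS too,
    # so hostname lookups don't leak outside TOR.
    url = "socks5h://{}:{}".format(host, port)
    return {"http": url, "https": url}

# With TOR Browser running and requests[socks] installed, usage would be:
#   import requests
#   requests.get("https://check.torproject.org/", proxies=tor_proxies())
print(tor_proxies())
```

This matches the installation note above: the IP and port passed to the tool must agree with what TOR Browser actually listens on.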

Link: http://feedproxy.google.com/~r/PentestTools/~3/_6P2jr417H0/cloudkill3r-bypasses-cloudflare.html

WPSeku v0.4 – WordPress Security Scanner

WPSeku is a black box WordPress vulnerability scanner that can be used to scan remote WordPress installations to find security issues.

Installation

    $ git clone https://github.com/m4ll0k/WPSeku.git wpseku
    $ cd wpseku
    $ pip3 install -r requirements.txt
    $ python3 wpseku.py

Usage

Generic Scan

    python3 wpseku.py --url https://www.xxxxxxx.com --verbose

Output

    ----------------------------------------
    WPSeku v0.4.0 - WordPress Security Scanner
    by Momo Outaadi (m4ll0k)
    ----------------------------------------
    [ + ] Target: https://www.xxxxxxx.com
    [ + ] Starting: 02:38:51
    [ + ] Server: Apache
    [ + ] Uncommon header "X-Pingback" found, with contents: https://www.xxxxxxx.com/xmlrpc.php
    [ i ] Checking Full Path Disclosure...
    [ + ] Full Path Disclosure: /home/ehc/public_html/wp-includes/rss-functions.php
    [ i ] Checking wp-config backup file...
    [ + ] wp-config.php available at: https://www.xxxxxxx.com/wp-config.php
    [ i ] Checking common files...
    [ + ] robots.txt file was found at: https://www.xxxxxxx.com/robots.txt
    [ + ] xmlrpc.php file was found at: https://www.xxxxxxx.com/xmlrpc.php
    [ + ] readme.html file was found at: https://www.xxxxxxx.com/readme.html
    [ i ] Checking directory listing...
    [ + ] Dir "/wp-admin/css" listing enable at: https://www.xxxxxxx.com/wp-admin/css/
    [ + ] Dir "/wp-admin/images" listing enable at: https://www.xxxxxxx.com/wp-admin/images/
    [ + ] Dir "/wp-admin/includes" listing enable at: https://www.xxxxxxx.com/wp-admin/includes/
    [ + ] Dir "/wp-admin/js" listing enable at: https://www.xxxxxxx.com/wp-admin/js/
    ...

Bruteforce Login

    python3 wpseku.py --url https://www.xxxxxxx.com --brute --user test --wordlist wl.txt --verbose

Output

    ----------------------------------------
    WPSeku v0.4.0 - WordPress Security Scanner
    by Momo Outaadi (m4ll0k)
    ----------------------------------------
    [ + ] Target: https://www.xxxxxxx.com
    [ + ] Starting: 02:46:32
    [ + ] Bruteforcing Login via XML-RPC...
    [ i ] Setting user: test
    [ + ] Valid Credentials:
    -----------------------------
    | Username | Passowrd       |
    -----------------------------
    | test     | kamperasqen13  |
    -----------------------------

Scan plugin, theme and WordPress code

    python3 wpseku.py --scan --verbose

Note: Testing Akismet Directory Plugin https://plugins.svn.wordpress.org/akismet

Output

    ----------------------------------------
    WPSeku v0.4.0 - WordPress Security Scanner
    by Momo Outaadi (m4ll0k)
    ----------------------------------------
    [ + ] Checking PHP code...
    [ + ] Scanning directory...
    [ i ] Scanning trunk/class.akismet.php file
    ----------------------------------------------------------
    | Line | Possibile Vuln.      | String                    |
    ----------------------------------------------------------
    | 597  | Cross-Site Scripting | [b"$_GET['action']", b"$_GET['action']"] |
    | 601  | Cross-Site Scripting | [b"$_GET['for']", b"$_GET['for']"] |
    | 140  | Cross-Site Scripting | [b"$_POST['akismet_comment_nonce']", b"$_POST['akismet_comment_nonce']"] |
    | 144  | Cross-Site Scripting | [b"$_POST['_ajax_nonce-replyto-comment']"] |
    | 586  | Cross-Site Scripting | [b"$_POST['status']", b"$_POST['status']"] |
    | 588  | Cross-Site Scripting | [b"$_POST['spam']", b"$_POST['spam']"] |
    | 590  | Cross-Site Scripting | [b"$_POST['unspam']", b"$_POST['unspam']"] |
    | 592  | Cross-Site Scripting | [b"$_POST['comment_status']", b"$_POST['comment_status']"] |
    | 599  | Cross-Site Scripting | [b"$_POST['action']", b"$_POST['action']"] |
    | 214  | Cross-Site Scripting | [b"$_SERVER['HTTP_REFERER']", b"$_SERVER['HTTP_REFERER']"] |
    | 403  | Cross-Site Scripting | [b"$_SERVER['REQUEST_TIME_FLOAT']", b"$_SERVER['REQUEST_TIME_FLOAT']"] |
    | 861  | Cross-Site Scripting | [b"$_SERVER['REMOTE_ADDR']", b"$_SERVER['REMOTE_ADDR']"] |
    | 930  | Cross-Site Scripting | [b"$_SERVER['HTTP_USER_AGENT']", b"$_SERVER['HTTP_USER_AGENT']"] |
    | 934  | Cross-Site Scripting | [b"$_SERVER['HTTP_REFERER']", b"$_SERVER['HTTP_REFERER']"] |
    | 1349 | Cross-Site Scripting | [b"$_SERVER['REMOTE_ADDR']"] |
    ----------------------------------------------------------
    [ i ] Scanning trunk/wrapper.php file
    [ + ] Not found vulnerabilities
    [ i ] Scanning trunk/akismet.php file
    -----------------------------------------------
    | Line | Possibile Vuln.    | String          |
    -----------------------------------------------
    | 55   | Authorization Hole | [b'is_admin()'] |
    -----------------------------------------------
    [ i ] Scanning trunk/class.akismet-cli.php file
    [ + ] Not found vulnerabilities
    [ i ] Scanning trunk/class.akismet-widget.php file
    [ + ] Not found vulnerabilities
    [ i ] Scanning trunk/index.php file
    [ + ] Not found vulnerabilities
    [ i ] Scanning trunk/class.akismet-admin.php file
    ----------------------------------------------------------
    | Line | Possibile Vuln.      | String                    |
    ----------------------------------------------------------
    | 39   | Cross-Site Scripting | [b"$_GET['page']", b"$_GET['page']"] |
    | 134  | Cross-Site Scripting | [b"$_GET['akismet_recheck']", b"$_GET['akismet_recheck']"] |
    | 152  | Cross-Site Scripting | [b"$_GET['view']", b"$_GET['view']"] |
    | 190  | Cross-Site Scripting | [b"$_GET['view']", b"$_GET['view']"] |
    | 388  | Cross-Site Scripting | [b"$_GET['recheckqueue']"] |
    | 841  | Cross-Site Scripting | [b"$_GET['view']", b"$_GET['view']"] |
    | 843  | Cross-Site Scripting | [b"$_GET['view']", b"$_GET['view']"] |
    | 850  | Cross-Site Scripting | [b"$_GET['action']"] |
    | 851  | Cross-Site Scripting | [b"$_GET['action']"] |
    | 852  | Cross-Site Scripting | [b"$_GET['_wpnonce']", b"$_GET['_wpnonce']"] |
    | 868  | Cross-Site Scripting | [b"$_GET['token']", b"$_GET['token']"] |
    | 869  | Cross-Site Scripting | [b"$_GET['token']"] |
    | 873  | Cross-Site Scripting | [b"$_GET['action']"] |
    | 874  | Cross-Site Scripting | [b"$_GET['action']"] |
    | 1005 | Cross-Site Scripting | [b"$_GET['akismet_recheck_complete']"] |
    | 1006 | Cross-Site Scripting | [b"$_GET['recheck_count']"] |
    | 1007 | Cross-Site Scripting | [b"$_GET['spam_count']"] |
    | 31   | Cross-Site Scripting | [b"$_POST['action']", b"$_POST['action']"] |
    | 256  | Cross-Site Scripting | [b"$_POST['_wpnonce']"] |
    | 260  | Cross-Site Scripting | [b'$_POST[$option]', b'$_POST[$option]'] |
    | 267  | Cross-Site Scripting | [b"$_POST['key']"] |
    | 392  | Cross-Site Scripting | [b"$_POST['offset']", b"$_POST['offset']", b"$_POST['limit']", b"$_POST['limit']"] |
    | 447  | Cross-Site Scripting | [b"$_POST['id']"] |
    | 448  | Cross-Site Scripting | [b"$_POST['id']"] |
    | 460  | Cross-Site Scripting | [b"$_POST['id']", b"$_POST['url']"] |
    | 461  | Cross-Site Scripting | [b"$_POST['id']"] |
    | 464  | Cross-Site Scripting | [b"$_POST['url']"] |
    | 388  | Cross-Site Scripting | [b"$_REQUEST['action']", b"$_REQUEST['action']"] |
    | 400  | Cross-Site Scripting | [b"$_SERVER['HTTP_REFERER']", b"$_SERVER['HTTP_REFERER']"] |
    ----------------------------------------------------------
    [ i ] Scanning trunk/class.akismet-rest-api.php file
    [ + ] Not found vulnerabilities

Credits and Contributors
- Original idea and script from WPScan Team (https://wpscan.org/)
- WPScan Vulnerability Database (https://wpvulndb.com/api)

Download WPSeku
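The code scan shown in the tables above essentially flags lines that read user-controlled PHP superglobals. A deliberately simplified Python sketch of that idea follows; WPSeku's real checks are more extensive, and the scan_php helper below is hypothetical:

```python
import re

# Simplified sketch: flag PHP lines that read user-controlled superglobals,
# the same class of sinks WPSeku's code scan reports.
SUPERGLOBALS = re.compile(r"\$_(GET|POST|REQUEST|SERVER|COOKIE)\[[^\]]+\]")

def scan_php(source):
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for match in SUPERGLOBALS.finditer(line):
            findings.append((lineno, match.group(0)))
    return findings

php = """<?php
$action = $_GET['action'];
echo "hello";
$ref = $_SERVER['HTTP_REFERER'];
"""
print(scan_php(php))
# -> [(2, "$_GET['action']"), (4, "$_SERVER['HTTP_REFERER']")]
```

Note that simply reading a superglobal is not itself a vulnerability; like WPSeku's output, this only points a reviewer at the lines worth auditing.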

Link: http://feedproxy.google.com/~r/PentestTools/~3/Rw3WvFwygMM/wpseku-v04-wordpress-security-scanner.html

Nmap 7.70 – Free Security Scanner: Better service and OS detection, 9 new NSE scripts, new Npcap, and much more

Nmap ("Network Mapper") is a free and open source utility for network discovery and security auditing. Many systems and network administrators also find it useful for tasks such as network inventory, managing service upgrade schedules, and monitoring host or service uptime. Nmap uses raw IP packets in novel ways to determine what hosts are available on the network, what services (application name and version) those hosts are offering, what operating systems (and OS versions) they are running, what type of packet filters/firewalls are in use, and dozens of other characteristics. It was designed to rapidly scan large networks, but works fine against single hosts. Nmap runs on all major computer operating systems, and official binary packages are available for Linux, Windows, and Mac OS X. In addition to the classic command-line Nmap executable, the Nmap suite includes an advanced GUI and results viewer (Zenmap), a flexible data transfer, redirection, and debugging tool (Ncat), a utility for comparing scan results (Ndiff), and a packet generation and response analysis tool (Nping).

Nmap was named "Security Product of the Year" by Linux Journal, Info World, LinuxQuestions.Org, and Codetalker Digest. It was even featured in twelve movies, including The Matrix Reloaded, Die Hard 4, Girl With the Dragon Tattoo, and The Bourne Ultimatum.

Features
- Flexible: Supports dozens of advanced techniques for mapping out networks filled with IP filters, firewalls, routers, and other obstacles. This includes many port scanning mechanisms (both TCP and UDP), OS detection, version detection, ping sweeps, and more. See the documentation page.
- Powerful: Nmap has been used to scan huge networks of literally hundreds of thousands of machines.
- Portable: Most operating systems are supported, including Linux, Microsoft Windows, FreeBSD, OpenBSD, Solaris, IRIX, Mac OS X, HP-UX, NetBSD, Sun OS, Amiga, and more.
- Easy: While Nmap offers a rich set of advanced features for power users, you can start out as simply as "nmap -v -A targethost". Both traditional command line and graphical (GUI) versions are available to suit your preference. Binaries are available for those who do not wish to compile Nmap from source.
- Free: The primary goals of the Nmap Project are to help make the Internet a little more secure and to provide administrators/auditors/hackers with an advanced tool for exploring their networks. Nmap is available for free download, and also comes with full source code that you may modify and redistribute under the terms of the license.
- Well Documented: Significant effort has been put into comprehensive and up-to-date man pages, whitepapers, tutorials, and even a whole book! Find them in multiple languages here.
- Supported: While Nmap comes with no warranty, it is well supported by a vibrant community of developers and users. Most of this interaction occurs on the Nmap mailing lists. Most bug reports and questions should be sent to the nmap-dev list, but only after you read the guidelines. We recommend that all users subscribe to the low-traffic nmap-hackers announcement list. You can also find Nmap on Facebook and Twitter. For real-time chat, join the #nmap channel on Freenode or EFNet.
- Acclaimed: Nmap has won numerous awards, including "Information Security Product of the Year" by Linux Journal, Info World and Codetalker Digest. It has been featured in hundreds of magazine articles, several movies, dozens of books, and one comic book series. Visit the press page for further details.
- Popular: Thousands of people download Nmap every day, and it is included with many operating systems (Redhat Linux, Debian Linux, Gentoo, FreeBSD, OpenBSD, etc.). It is among the top ten (out of 30,000) programs at the Freshmeat.Net repository. This is important because it lends Nmap its vibrant development and user support communities.

Changelog
Here is the full list of significant changes:

• [Windows] We made a ton of improvements to our Npcap Windows packet capturing library (https://nmap.org/npcap/) for greater performance and stability, as well as a smoother installer and better 802.11 raw frame capturing support. Nmap 7.70 updates the bundled Npcap from version 0.93 to 0.99-r2, including all these changes from the last seven Npcap releases: https://nmap.org/npcap/changelog

• Integrated all of your service/version detection fingerprints submitted from March 2017 to August 2017 (728 of them). The signature count went up 1.02% to 11,672, including 26 new softmatches. We now detect 1224 protocols, from filenet-pch, lscp, and netassistant to sharp-remote, urbackup, and watchguard. We will try to integrate the remaining submissions in the next release.

• Integrated all of your IPv4 OS fingerprint submissions from September 2016 to August 2017 (667 of them). Added 298 fingerprints, bringing the new total to 5,652. Additions include iOS 11, macOS Sierra, Linux 4.14, Android 7, and more.

• Integrated all 33 of your IPv6 OS fingerprint submissions from September 2016 to August 2017. New groups for OpenBSD 6.0 and FreeBSD 11.0 were added, as well as strengthened groups for Linux and OS X.

• Added the --resolve-all option to resolve and scan all IP addresses of a host. This essentially replaces the resolveall NSE script. [Daniel Miller]

• [NSE][SECURITY] Nmap developer nnposter found a security flaw (directory traversal vulnerability) in the way the non-default http-fetch script sanitized URLs. If a user manually ran this NSE script against a malicious web server, the server could potentially (depending on NSE arguments used) cause files to be saved outside the intended destination directory. Existing files couldn't be overwritten. We fixed http-fetch, audited our other scripts to ensure they didn't make this mistake, and updated the httpspider library API to protect against this by default. [nnposter, Daniel Miller]

• [NSE] Added 9 NSE scripts, from 8 authors, bringing the total up to 588! They are all listed at https://nmap.org/nsedoc/, and the summaries are below:
  - deluge-rpc-brute performs brute-force credential testing against Deluge BitTorrent RPC services, using the new zlib library. [Claudiu Perta]
  - hostmap-crtsh lists subdomains by querying Google's Certificate Transparency logs. [Paulino Calderon]
  - [GH#892] http-bigip-cookie decodes unencrypted F5 BIG-IP cookies and reports back the IP address and port of the actual server behind the load-balancer. [Seth Jackson]
  - http-jsonp-detection attempts to discover JSONP endpoints in web servers. JSONP endpoints can be used to bypass Same-origin Policy restrictions in web browsers. [Vinamra Bhatia]
  - http-trane-info obtains information from Trane Tracer SC controllers and connected HVAC devices. [Pedro Joaquin]
  - [GH#609] nbd-info uses the new nbd.lua library to query Network Block Devices for protocol and file export information. [Mak Kolybabi]
  - rsa-vuln-roca checks for RSA keys generated by Infineon TPMs vulnerable to Return Of Coppersmith Attack (ROCA) (CVE-2017-15361). Checks SSH and TLS services. [Daniel Miller]
  - [GH#987] smb-enum-services retrieves the list of services running on a remote Windows machine. Modern Windows systems require a privileged domain account in order to list the services. [Rewanth Cool]
  - tls-alpn checks TLS servers for Application Layer Protocol Negotiation (ALPN) support and reports supported protocols. ALPN largely replaces NPN, which tls-nextprotoneg was written for. [Daniel Miller]

• [GH#978] Fixed Nsock on Windows giving errors when selecting on STDIN. This was causing Ncat 7.60 in connect mode to quit with error: libnsock select_loop(): nsock_loop error 10038: An operation was attempted on something that is not a socket. [nnposter]

• [Ncat][GH#197][GH#1049] Fix --ssl connections from dropping on renegotiation, the same issue that was partially fixed for server mode in [GH#773]. Reported on Windows with -e by pkreuzt and vinod272. [Daniel Miller]

• [NSE][GH#1062][GH#1149] Some changes to brute.lua to better handle misbehaving or rate-limiting services. Most significantly, brute.killstagnated now defaults to true. Thanks to xp3s and Adamtimtim for reporting infinite loops and proposing changes.

• [NSE] VNC scripts now support Apple Remote Desktop authentication (auth type 30). [Daniel Miller]

• [NSE][GH#1111] Fix a script crash in ftp.lua when the PASV connection timed out. [Aniket Pandey]

• [NSE][GH#1114] Update bitcoin-getaddr to receive more than one response message, since the first message usually only has one address in it. [h43z]

• [Ncat][GH#1139] Ncat now selects the correct default port for a given proxy type. [Pavel Zhukov]

• [NSE] memcached-info can now gather information from the UDP memcached service in addition to the TCP service. The UDP service is frequently used as a DDoS reflector and amplifier. [Daniel Miller]

• [NSE][GH#1129] Changed url.absolute() behavior with respect to dot and dot-dot path segments to comply with RFC 3986, section 5.2. [nnposter]

• Removed deprecated and undocumented aliases for several long options that used underscores instead of hyphens, such as --max_retries. [Daniel Miller]

• Improved service scan's treatment of soft matches in two ways. First of all, any probes that could result in a full match with the soft matched service will now be sent, regardless of rarity. This improves the chances of matching unusual services on non-standard ports. Second, probes are now skipped if they don't contain any signatures for the soft matched service. Previously the probes would still be run as long as the target port number matched the probe's specification. Together, these changes should make service/version detection faster and more accurate. For more details on how it works, see https://nmap.org/book/vscan.html. [Daniel Miller]

• --version-all now turns off the soft match optimization, ensuring that all probes really are sent, even if there aren't any existing match lines for the softmatched service. This is slower, but gives the most comprehensive results and produces better fingerprints for submission. [Daniel Miller]

• [NSE][GH#1083] New set of Telnet softmatches for version detection based on Telnet DO/DON'T options offered, covering a wide variety of devices and operating systems. [D Roberson]

• [GH#1112] Resolved crash opportunities caused by unexpected libpcap version string format. [Gisle Vanem, nnposter]

• [NSE][GH#1090] Fix false positives in rexec-brute by checking responses for indications of login failure. [Daniel Miller]

• [NSE][GH#1099] Fix http-fetch to keep downloaded files in separate destination directories. [Aniket Pandey]

• [NSE] Added new fingerprints to http-default-accounts:
  + Hikvision DS-XXX Network Camera and NUOO DVR [Paulino Calderon]
  + [GH#1074] ActiveMQ, Purestorage, and Axis Network Cameras [Rob Fitzpatrick, Paulino Calderon]

• Added a new service detection match for WatchGuard Authentication Gateway. [Paulino Calderon]

• [NSE][GH#1038][GH#1037] Script qscan was not observing interpacket delays (parameter qscan.delay). [nnposter]

• [NSE][GH#1046] Script http-headers now fails properly if the target does not return a valid HTTP response. [spacewander]

• [Ncat][Nsock][GH#972] Remove RC4 from the list of TLS ciphers used by default, in accordance with RFC 7465. [Codarren Velvindron]

• [NSE][GH#1022] Fix a false positive condition in ipmi-cipher-zero caused by not checking the error code in responses. Implementations which return an error are not vulnerable. [Juho Jokelainen]

• [NSE][GH#958] Two new libraries for NSE:
  - idna - Support for internationalized domain names in applications (IDNA)
  - punycode (a transfer encoding syntax used in IDNA) [Rewanth Cool]

• [NSE] New fingerprints for http-enum:
  - [GH#954] Telerik UI CVE-2017-9248 [Harrison Neal]
  - [GH#767] Many WordPress version detections [Rewanth Cool]

• [GH#981][GH#984][GH#996][GH#975] Fixed Ncat proxy authentication issues [nnposter]:
  - Usernames and/or passwords could not be empty
  - Passwords could not contain colons
  - SOCKS5 authentication was not properly documented
  - SOCKS5 authentication had a memory leak

• [GH#1009][GH#1013] Fixes to autoconf header files to allow autoreconf to be run. [Lukas Schwaighofer]

• [GH#977] Improved DNS service version detection coverage and consistency by using data from a Project Sonar Internet-wide survey. Numerous false positives were removed and reliable softmatches added. Match lines for version.bind responses were also consolidated using the technique below. [Tom Sellers]

• [GH#977] Changed version probe fallbacks so as to work cross-protocol (TCP/UDP). This enables consolidating match lines for services where the responses on TCP and UDP are similar. [Tom Sellers]

• [NSE][GH#532] Added the zlib library for NSE so scripts can easily handle compression. This work started during GSOC 2014, so we're particularly pleased to finally integrate it! [Claudiu Perta, Daniel Miller]

• [NSE][GH#1004] Fixed handling of the brute.retries variable. It was being treated as the number of tries, not retries, and a value of 0 would result in infinite retries. Instead, it is now the number of retries, defaulting to 2 (3 total tries), with no option for infinite retries.

• [NSE] http-devframework-fingerprints.lua supports Jenkins server detection and returns extra information when Jenkins is detected. [Vinamra Bhatia]

• [GH#926] The rarity level of MS SQL's service detection probe was decreased. Now we can find MS SQL on odd ports without increasing version intensity. [Paulino Calderon]

• [GH#957] Fix reporting of zlib and libssh2 versions in "nmap --version". We were always reporting the version number of the included source, even when a different version was actually linked. [Pavel Zhukov]

• Add a new helper function for nmap-service-probes match lines: $I(1,">") will unpack an unsigned big-endian integer value up to 8 bytes wide from capture 1. The second option can be "<" for little-endian. [Daniel Miller]

Download Nmap 7.70
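The $I helper in the last changelog entry lives in Nmap's own match-line engine; its described behavior (unpacking an up-to-8-byte unsigned integer from a capture, big- or little-endian) can be mimicked in Python for clarity. This is a hedged sketch of the documented semantics, not Nmap's actual implementation:

```python
def unpack_uint(capture: bytes, byte_order: str = ">") -> int:
    # Mirrors the semantics described for $I(1,">"): interpret up to 8
    # captured bytes as an unsigned integer; ">" big-endian, "<" little-endian.
    if not 1 <= len(capture) <= 8:
        raise ValueError("capture must be 1-8 bytes")
    order = "big" if byte_order == ">" else "little"
    return int.from_bytes(capture, order)

print(unpack_uint(b"\x01\x00", ">"))  # 256
print(unpack_uint(b"\x01\x00", "<"))  # 1
```

Variable-width captures are why a plain fixed-size unpack won't do: the helper has to accept whatever the probe's capture group happened to grab, from one byte up to eight.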

Link: http://feedproxy.google.com/~r/PentestTools/~3/8CNBI50qetc/nmap-770-free-security-scanner-better.html

WPHunter – WordPress Vulnerability Scanner

You can use this tool on your WordPress website to check its security by finding vulnerabilities.

Over 75 million websites run on WordPress, which now powers 26% of the Web. Remarkably enough, thousands of WP sites are vulnerable to attacks and get hacked each day. You can lose all your data, it can cost thousands of dollars, or worse, attackers might use your WordPress to target your visitors. Bots scan the web automatically for weak websites and hack into them within seconds. If your WordPress is vulnerable, it will be only a matter of time before you run into trouble. That's why you should get started as soon as possible and check if your WordPress is prone to attack.

[+] Auto CMS Detect
[1] WordPress: The tool detects the WordPress version and tries to find the vulnerabilities affecting that version. It also detects the plugins and themes installed on the website. WPHunter can also find backup files and path disclosure, and checks security headers.

Usage
    Short Form  Long Form  Description
    -h          --help     usage of the tool

Example
If you have a list of websites, run the tool with the list command line; if you don't have a list of websites, run the tool with this command:

    php wphunter.php https://www.example.com

Installation Linux

    git clone https://github.com/Jamalc0m/wphunter/wphunter.git
    cd WPHunter
    php wphunter.php

Installation Windows
- Download and install PHP
- Download WPHunter
- Extract WPHunter into Desktop
- Open CMD and type the following commands:

    cd Desktop/wphunter-master/
    php wphunter.php

Version
Current version is 0.1 Beta.
Upcoming features: scan for plugin and theme vulnerabilities, generate reports (PDF, HTML), password brute force.

Download Wphunter
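WordPress version detection of the kind described above usually amounts to pattern-matching the generator meta tag (or files like readme.html). A hedged, self-contained Python illustration of that trick; this is not WPHunter's actual PHP code, and real sites may strip the tag:

```python
import re

# WordPress reveals its version in the <meta name="generator"> tag
# unless the site owner removes it.
GENERATOR = re.compile(
    r'<meta name="generator" content="WordPress ([\d.]+)"', re.IGNORECASE)

def detect_wp_version(html):
    # Return the advertised WordPress version, or None if it's hidden.
    match = GENERATOR.search(html)
    return match.group(1) if match else None

sample = ('<html><head>'
          '<meta name="generator" content="WordPress 4.9.5" />'
          '</head></html>')
print(detect_wp_version(sample))  # 4.9.5
```

Once the version string is known, a scanner only has to look it up against a vulnerability database to produce findings.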

Link: http://feedproxy.google.com/~r/PentestTools/~3/qzcpMRK-3hQ/wphunter-wordpress-vulnerability-scanner.html