""" robotparser.py

    Copyright (C) 2000  Bastian Kleineidam

    You can choose between two licenses when using this package:
    1) GNU GPLv2
    2) PSF license for Python 2.2

    The robots.txt Exclusion Protocol is implemented as specified in
    http://info.webcrawler.com/mak/projects/robots/norobots-rfc.html
"""
import urlparse
import urllib

__all__ = ["RobotFileParser"]


class RobotFileParser:
    """ This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.

    """

    def __init__(self, url=''):
        self.entries = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.

        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.

        """
        import time
        self.last_checked = time.time()

    def set_url(self, url):
        """Sets the URL referring to a robots.txt file."""
        self.url = url
        self.host, self.path = urlparse.urlparse(url)[1:3]

    def read(self):
        """Reads the robots.txt URL and feeds it to the parser."""
        opener = URLopener()
        f = opener.open(self.url)
        lines = [line.strip() for line in f]
        f.close()
        self.errcode = opener.errcode
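        # 401/403 (access restricted) are treated conservatively as "disallow
        # all"; any other error status is treated as though no robots.txt
        # exists, so everything is allowed.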
        if self.errcode in (401, 403):
            self.disallow_all = True
        elif self.errcode >= 400:
            self.allow_all = True
        elif self.errcode == 200 and lines:
            self.parse(lines)

    def _add_entry(self, entry):
        if "*" in entry.useragents:
            # the default entry is considered last
            if self.default_entry is None:
                # the first default entry wins
                self.default_entry = entry
        else:
            self.entries.append(entry)

    def parse(self, lines):
        """parse the input lines from a robots.txt file.
           We allow that a user-agent: line is not preceded by
           one or more blank lines."""
        # states:
        #   0: start state
        #   1: saw user-agent line
        #   2: saw an allow or disallow line
        state = 0
        linenumber = 0
        entry = Entry()

        for line in lines:
            linenumber += 1
            if not line:
                if state == 1:
                    entry = Entry()
                    state = 0
                elif state == 2:
                    self._add_entry(entry)
                    entry = Entry()
                    state = 0
            # remove optional comment and strip line
            i = line.find('#')
            if i >= 0:
                line = line[:i]
            line = line.strip()
            if not line:
                continue
            line = line.split(':', 1)
            if len(line) == 2:
                line[0] = line[0].strip().lower()
                line[1] = urllib.unquote(line[1].strip())
                if line[0] == "user-agent":
                    if state == 2:
                        self._add_entry(entry)
                        entry = Entry()
                    entry.useragents.append(line[1])
                    state = 1
                elif line[0] == "disallow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], False))
                        state = 2
                elif line[0] == "allow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], True))
                        state = 2
        if state == 2:
            self._add_entry(entry)


    def can_fetch(self, useragent, url):
        """using the parsed robots.txt decide if useragent can fetch url"""
        if self.disallow_all:
            return False
        if self.allow_all:
            return True
        # search for given user agent matches
        # the first match counts
        parsed_url = urlparse.urlparse(urllib.unquote(url))
        url = urlparse.urlunparse(('', '', parsed_url.path,
            parsed_url.params, parsed_url.query, parsed_url.fragment))
        url = urllib.quote(url)
        if not url:
            url = "/"
        for entry in self.entries:
            if entry.applies_to(useragent):
                return entry.allowance(url)
        # try the default entry last
        if self.default_entry:
            return self.default_entry.allowance(url)
        # agent not found ==> access granted
        return True


    def __str__(self):
        return ''.join([str(entry) + "\n" for entry in self.entries])


class RuleLine:
    """A rule line is a single "Allow:" (allowance==True) or "Disallow:"
       (allowance==False) followed by a path."""
    def __init__(self, path, allowance):
        if path == '' and not allowance:
            # an empty value means allow all
            allowance = True
        self.path = urllib.quote(path)
        self.allowance = allowance

    def applies_to(self, filename):
        return self.path == "*" or filename.startswith(self.path)

    def __str__(self):
        return (self.allowance and "Allow" or "Disallow") + ": " + self.path


class Entry:
    """An entry has one or more user-agents and zero or more rulelines"""
    def __init__(self):
        self.useragents = []
        self.rulelines = []

    def __str__(self):
        ret = []
        for agent in self.useragents:
            ret.extend(["User-agent: ", agent, "\n"])
        for line in self.rulelines:
            ret.extend([str(line), "\n"])
        return ''.join(ret)

    def applies_to(self, useragent):
        """check if this entry applies to the specified agent"""
        # split the name token and make it lower case
        useragent = useragent.split("/")[0].lower()
        for agent in self.useragents:
            if agent == '*':
                # we have the catch-all agent
                return True
            agent = agent.lower()
            if agent in useragent:
                return True
        return False
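    # For example, an entry listing the agent "examplebot" (placeholder name)
    # applies to the request agent string "ExampleBot/1.0": the name token is
    # lowercased to "examplebot", which contains "examplebot".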

    def allowance(self, filename):
        """Preconditions:
        - our agent applies to this entry
        - filename is URL decoded"""
        for line in self.rulelines:
            if line.applies_to(filename):
                return line.allowance
        return True

class URLopener(urllib.FancyURLopener):
    def __init__(self, *args):
        urllib.FancyURLopener.__init__(self, *args)
        self.errcode = 200

    def prompt_user_passwd(self, host, realm):
        ## If robots.txt file is accessible only with a password,
        ## we act as if the file wasn't there.
        return None, None

    def http_error_default(self, url, fp, errcode, errmsg, headers):
        self.errcode = errcode
        return urllib.FancyURLopener.http_error_default(self, url, fp, errcode,
                                                        errmsg, headers)
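

# Minimal usage sketch: the URL and the agent name "examplebot" are
# placeholders, and read() performs a live HTTP request, so this only works
# against a host that actually serves a robots.txt.
if __name__ == '__main__':
    rp = RobotFileParser("http://www.example.com/robots.txt")
    rp.read()
    print rp.can_fetch("examplebot", "http://www.example.com/private/page.html")
    print rp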
