Current Path : /usr/local/lib/python3.12/site-packages/pandas/io/json/__pycache__/
Current File : //usr/local/lib/python3.12/site-packages/pandas/io/json/__pycache__/_json.cpython-312.pyc

[Compiled CPython 3.12 bytecode of pandas/io/json/_json.py — the marshalled code objects are not human-readable in this form. The structure recoverable from the embedded strings and symbol names is:

- module imports: ``ujson_dumps`` / ``ujson_loads``, ``DataFrame``, ``Series``, ``Index``, ``MultiIndex``, ``to_datetime``, ``concat``, ``get_handle`` and the other ``pandas.io.common`` helpers, ``build_table_schema`` / ``parse_table_schema``, ``dedup_names``, and the shared-docs machinery (``doc``, ``_shared_docs``);
- the ``to_json()`` entry point and its ``@overload`` stubs. It validates the keyword combinations: ``index=True`` is only valid for orient ``'split'``, ``'table'``, ``'index'`` or ``'columns'``; ``index=False`` only for ``'split'``, ``'table'``, ``'records'`` or ``'values'``; ``lines=True`` only for ``orient='records'``; ``mode`` must be ``'w'`` or ``'a'``, and ``mode='a'`` (append) is only supported together with ``lines=True`` and ``orient='records'``. A Series written with ``orient='table'`` is first converted to a one-column frame, the matching writer is selected, and the output is optionally converted to line-delimited form and written through ``get_handle``;
- the writer hierarchy: ``Writer`` with subclasses ``SeriesWriter``, ``FrameWriter`` and ``JSONTableWriter`` (the last deriving from ``FrameWriter``). ``SeriesWriter`` requires a unique index for ``orient='index'``; ``FrameWriter`` requires a unique index for orients ``'index'``/``'columns'`` and unique columns for ``'index'``/``'columns'``/``'records'``; ``JSONTableWriter`` forces ``date_format='iso'``, attaches a Table Schema, and rejects MultiIndex columns and overlapping index/column names;
- the ``@overload`` stubs for ``read_json()`` and the start of its implementation. A usage sketch of the append mode follows.]
    Convert a JSON string to pandas object.

    Parameters
    ----------
    path_or_buf : a valid JSON str, path object or file-like object
        Any valid string path is acceptable. The string could be a URL. Valid
        URL schemes include http, ftp, s3, and file. For file URLs, a host is
        expected. A local file could be:
        ``file://localhost/path/to/table.json``.

        If you want to pass in a path object, pandas accepts any
        ``os.PathLike``.

        By file-like object, we refer to objects with a ``read()`` method,
        such as a file handle (e.g. via builtin ``open`` function)
        or ``StringIO``.

        .. deprecated:: 2.1.0
            Passing json literal strings is deprecated.

    orient : str, optional
        Indication of expected JSON string format.
        Compatible JSON strings can be produced by ``to_json()`` with a
        corresponding orient value.
        The set of possible orients is:

        - ``'split'`` : dict like
          ``{{index -> [index], columns -> [columns], data -> [values]}}``
        - ``'records'`` : list like
          ``[{{column -> value}}, ... , {{column -> value}}]``
        - ``'index'`` : dict like ``{{index -> {{column -> value}}}}``
        - ``'columns'`` : dict like ``{{column -> {{index -> value}}}}``
        - ``'values'`` : just the values array
        - ``'table'`` : dict like ``{{'schema': {{schema}}, 'data': {{data}}}}``

        The allowed and default values depend on the value
        of the `typ` parameter.

        * when ``typ == 'series'``,

          - allowed orients are ``{{'split','records','index'}}``
          - default is ``'index'``
          - The Series index must be unique for orient ``'index'``.

        * when ``typ == 'frame'``,

          - allowed orients are ``{{'split','records','index',
            'columns','values', 'table'}}``
          - default is ``'columns'``
          - The DataFrame index must be unique for orients ``'index'`` and
            ``'columns'``.
          - The DataFrame columns must be unique for orients ``'index'``,
            ``'columns'``, and ``'records'``.

    typ : {{'frame', 'series'}}, default 'frame'
        The type of object to recover.

    dtype : bool or dict, default None
        If True, infer dtypes; if a dict of column to dtype, then use those;
        if False, then don't infer dtypes at all; applies only to the data.

        For all ``orient`` values except ``'table'``, default is True.

    convert_axes : bool, default None
        Try to convert the axes to the proper dtypes.

        For all ``orient`` values except ``'table'``, default is True.

    convert_dates : bool or list of str, default True
        If True then default datelike columns may be converted (depending on
        keep_default_dates).
        If False, no dates will be converted.
        If a list of column names, then those columns will be converted and
        default datelike columns may also be converted (depending on
        keep_default_dates).

    keep_default_dates : bool, default True
        If parsing dates (convert_dates is not False), then try to parse the
        default datelike columns.
        A column label is datelike if

        * it ends with ``'_at'``,

        * it ends with ``'_time'``,

        * it begins with ``'timestamp'``,

        * it is ``'modified'``, or

        * it is ``'date'``.

    precise_float : bool, default False
        Set to enable usage of higher precision (strtod) function when
        decoding string to double values. Default (False) is to use fast but
        less precise builtin functionality.

    date_unit : str, default None
        The timestamp unit to detect if converting dates. The default behaviour
        is to try and detect the correct precision, but if this is not desired
        then pass one of 's', 'ms', 'us' or 'ns' to force parsing only seconds,
        milliseconds, microseconds or nanoseconds respectively.

    encoding : str, default is 'utf-8'
        The encoding to use to decode py3 bytes.

    encoding_errors : str, optional, default "strict"
        How encoding errors are treated. `List of possible values
        <https://docs.python.org/3/library/codecs.html#error-handlers>`_ .

        .. versionadded:: 1.3.0

    lines : bool, default False
        Read the file as a json object per line.

    chunksize : int, optional
        Return JsonReader object for iteration.
        See the `line-delimited json docs
        <https://pandas.pydata.org/pandas-docs/stable/user_guide/io.html#line-delimited-json>`_
        for more information on ``chunksize``.
        This can only be passed if `lines=True`.
        If this is None, the file will be read into memory all at once.
        A usage sketch follows the Examples section below.
    {decompression_options}

        .. versionchanged:: 1.4.0 Zstandard support.

    nrows : int, optional
        The number of lines to read from the line-delimited JSON file.
        This can only be passed if `lines=True`.
        If this is None, all the rows will be returned.

    {storage_options}

    dtype_backend : {{'numpy_nullable', 'pyarrow'}}, default 'numpy_nullable'
        Back-end data type applied to the resultant :class:`DataFrame`
        (still experimental). Behaviour is as follows:

        * ``"numpy_nullable"``: returns nullable-dtype-backed :class:`DataFrame`
          (default).
        * ``"pyarrow"``: returns pyarrow-backed nullable :class:`ArrowDtype`
          DataFrame.

        .. versionadded:: 2.0

    engine : {{"ujson", "pyarrow"}}, default "ujson"
        Parser engine to use. The ``"pyarrow"`` engine is only available when
        ``lines=True``.

        .. versionadded:: 2.0

    Returns
    -------
    Series, DataFrame, or pandas.api.typing.JsonReader
        A JsonReader is returned when ``chunksize`` is not ``0`` or ``None``.
        Otherwise, the type returned depends on the value of ``typ``.

    See Also
    --------
    DataFrame.to_json : Convert a DataFrame to a JSON string.
    Series.to_json : Convert a Series to a JSON string.
    json_normalize : Normalize semi-structured JSON data into a flat table.

    Notes
    -----
    Specific to ``orient='table'``, if a :class:`DataFrame` with a literal
    :class:`Index` name of `index` gets written with :func:`to_json`, the
    subsequent read operation will incorrectly set the :class:`Index` name to
    ``None``. This is because `index` is also used by :func:`DataFrame.to_json`
    to denote a missing :class:`Index` name, and the subsequent
    :func:`read_json` operation cannot distinguish between the two. The same
    limitation is encountered with a :class:`MultiIndex` and any names
    beginning with ``'level_'``.

    Examples
    --------
    >>> from io import StringIO
    >>> df = pd.DataFrame([['a', 'b'], ['c', 'd']],
    ...                   index=['row 1', 'row 2'],
    ...                   columns=['col 1', 'col 2'])

    Encoding/decoding a Dataframe using ``'split'`` formatted JSON:

    >>> df.to_json(orient='split')
        '{{"columns":["col 1","col 2"],"index":["row 1","row 2"],"data":[["a","b"],["c","d"]]}}'
    >>> pd.read_json(StringIO(_), orient='split')
          col 1 col 2
    row 1     a     b
    row 2     c     d

    Encoding/decoding a Dataframe using ``'index'`` formatted JSON:

    >>> df.to_json(orient='index')
    '{{"row 1":{{"col 1":"a","col 2":"b"}},"row 2":{{"col 1":"c","col 2":"d"}}}}'

    >>> pd.read_json(StringIO(_), orient='index')
          col 1 col 2
    row 1     a     b
    row 2     c     d

    Encoding/decoding a Dataframe using ``'records'`` formatted JSON.
    Note that index labels are not preserved with this encoding.

    >>> df.to_json(orient='records')
    '[{{"col 1":"a","col 2":"b"}},{{"col 1":"c","col 2":"d"}}]'
    >>> pd.read_json(StringIO(_), orient='records')
      col 1 col 2
    0     a     b
    1     c     d

    Encoding with Table Schema

    >>> df.to_json(orient='table')
        '{{"schema":{{"fields":[{{"name":"index","type":"string"}},{{"name":"col 1","type":"string"}},{{"name":"col 2","type":"string"}}],"primaryKey":["index"],"pandas_version":"1.4.0"}},"data":[{{"index":"row 1","col 1":"a","col 2":"b"}},{{"index":"row 2","col 1":"c","col 2":"d"}}]}}'

    The following example uses ``dtype_backend="numpy_nullable"``

    >>> data = '''{{"index": {{"0": 0, "1": 1}},
    ...        "a": {{"0": 1, "1": null}},
    ...        "b": {{"0": 2.5, "1": 4.5}},
    ...        "c": {{"0": true, "1": false}},
    ...        "d": {{"0": "a", "1": "b"}},
    ...        "e": {{"0": 1577.2, "1": 1577.1}}}}'''
    >>> pd.read_json(StringIO(data), dtype_backend="numpy_nullable")
       index     a    b      c  d       e
    0      0     1  2.5   True  a  1577.2
    1      1  <NA>  4.5  False  b  1577.1
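
One combination the Examples above do not cover is the ``lines``/``chunksize`` pair. The following is an editorial sketch, not part of the module: with ``chunksize=2`` the call returns an iterable ``JsonReader`` that yields DataFrames of at most two rows, and the reader doubles as a context manager so the underlying handle is closed when iteration ends.

    >>> from io import StringIO
    >>> import pandas as pd
    >>> data = '{"a": 1}\n{"a": 2}\n{"a": 3}\n'
    >>> with pd.read_json(StringIO(data), lines=True, chunksize=2) as reader:
    ...     for chunk in reader:
    ...         print(len(chunk))
    2
    1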
[The remainder of the module follows as compiled bytecode. Its recoverable structure is:

- the body of ``read_json()``: ``dtype`` and ``convert_axes`` may not be combined with ``orient='table'`` (for every other orient both default to True); the arguments are bundled into a ``JsonReader``, which is returned as-is when ``chunksize`` is given and otherwise consumed immediately via ``.read()``;
- class ``JsonReader``, whose docstring reads: "JsonReader provides an interface for reading in a JSON file. If initialized with ``lines=True`` and ``chunksize``, can be iterated over ``chunksize`` lines at a time. Otherwise, calling ``read`` reads in the whole document." Its constructor validates that ``engine`` is ``'ujson'`` or ``'pyarrow'``, that ``chunksize`` and ``nrows`` are only passed with ``lines=True``, and that the pyarrow engine is only used for line-delimited input (and without ``chunksize``); passing a literal JSON string emits a FutureWarning asking for a ``StringIO`` wrapper. Paths, URLs and fsspec URLs are opened through ``get_handle``, and a string that looks like a path to a ``.json`` file (possibly compressed) but does not exist raises ``FileNotFoundError``;
- ``JsonReader.read()``: with ``engine='pyarrow'`` the input is parsed by ``pyarrow.json`` and converted via ``Table.to_pandas`` (honouring ``dtype_backend``); with ``engine='ujson'`` the lines are optionally limited by ``nrows``, combined into one document, handed to a parser, and finally run through ``convert_dtypes`` when a non-default ``dtype_backend`` is requested;
- the ``Parser`` base class with ``SeriesParser`` and ``FrameParser``: they decode the document with ``ujson_loads``, convert the axes, and try to coerce object columns to ``float64`` or ``int64`` and to parse datelike data, trying the units ``'s'``, ``'ms'``, ``'us'`` and ``'ns'`` in turn;
- ``FrameParser``'s default datelike-column heuristic, the source of the ``keep_default_dates`` rules documented above: a column is tried as a date if its name ends with ``'_at'`` or ``'_time'``, begins with ``'timestamp'``, or is exactly ``'modified'``, ``'date'`` or ``'datetime'``; numeric values smaller than a minimum stamp (about one year after the Unix epoch) are never treated as timestamps. A round-trip sketch follows.]
