
Commit 7e3f1f4 (0 parents)

deploy: f2dcefc

62 files changed: +26108 −0 lines

.buildinfo

Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config: 6a2af01995ecb4a6bfe4433df65e503d
tags: 645f666f9bcd5a90fca523b33c5a78b7

.nojekyll

Whitespace-only changes.

Makefile

Lines changed: 75 additions & 0 deletions
@@ -0,0 +1,75 @@
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS  =
SPHINXBUILD = sphinx-build
PAPER       =

# Internal variables.
PAPEROPT_a4     = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS   = -d .build/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .

.PHONY: help clean html web pickle json htmlhelp latex changes linkcheck

help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html      to make standalone HTML files"
	@echo "  pickle    to make pickle files"
	@echo "  json      to make JSON files"
	@echo "  htmlhelp  to make HTML files and a HTML help project"
	@echo "  latex     to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  changes   to make an overview over all changed/added/deprecated items"
	@echo "  linkcheck to check all external links for integrity"

clean:
	-rm -rf .build/*

html:
	mkdir -p .build/html .build/doctrees
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) ../docs
	@echo
	@echo "Build finished. The HTML pages are in ../docs."

pickle:
	mkdir -p .build/pickle .build/doctrees
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) .build/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

web: pickle

json:
	mkdir -p .build/json .build/doctrees
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) .build/json
	@echo
	@echo "Build finished; now you can process the JSON files."

htmlhelp:
	mkdir -p .build/htmlhelp .build/doctrees
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) .build/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in .build/htmlhelp."

latex:
	mkdir -p .build/latex .build/doctrees
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) .build/latex
	@echo
	@echo "Build finished; the LaTeX files are in .build/latex."
	@echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \
	      "run these through (pdf)latex."

changes:
	mkdir -p .build/changes .build/doctrees
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) .build/changes
	@echo
	@echo "The overview file is in .build/changes."

linkcheck:
	mkdir -p .build/linkcheck .build/doctrees
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) .build/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output" \
	      "or in .build/linkcheck/output.txt."

_sources/changelog.txt

Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
#########
Changelog
#########

* `Version 1.5.0`_ (15.3.2023)

Version 1.5.0
=============

_sources/differences-from-kdb.txt

Lines changed: 114 additions & 0 deletions
@@ -0,0 +1,114 @@
============================
Differences from KInterbasDB
============================

No need for initialization
==========================

IDB doesn't support the various configurations of automatic type translation that
KDB does, so it is no longer necessary to initialize the driver before any feature
is used.

Distributed transactions
========================

Support for :ref:`Distributed Transactions <distributed_transactions>` works slightly
differently than in KDB. IDB uses a :class:`~idb.ConnectionGroup` class with the same
interface as KDB's, but the distributed transaction (DT) is not bound to the main
transaction of the individual connections managed by the group. That means that
:class:`~idb.Cursor` instances obtained from a :class:`~idb.Connection` don't work in
the DT when the connection is part of a ConnectionGroup, but they work normally in the
connection's own context. To get a Cursor for a specific connection that works in the
DT, use the :meth:`idb.ConnectionGroup.cursor()` method and pass the connection as a
parameter. We believe that this arrangement is more logical and flexible than KDB's.

The transaction context for cursor objects depends on how the cursor is obtained/created:

a) :meth:`idb.Connection.cursor()` - works in the context of the "main" transaction for the connection.
b) :meth:`idb.Transaction.cursor()` - works in the context of that transaction.
c) :meth:`idb.ConnectionGroup.cursor()` - works in the context of the Distributed Transaction.

Stream BLOBs
============

InterBase supports two types of BLOBs, stream and segmented. The database stores
segmented BLOBs in chunks. Each chunk starts with a two-byte length indicator
followed by however many bytes of data were passed as a segment. Stream BLOBs
are stored as a continuous array of data bytes with no length indicators included.
Both types of BLOBs can be accessed through the same API functions, but only stream
BLOBs support the seek operation (via the `isc_seek_blob` function).
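The segmented framing described above can be modeled in a few lines of pure Python. This is an illustrative sketch of the length-prefixed layout, not code from the IDB driver, and the little-endian byte order is an assumption made for the example:

```python
import struct

def pack_segments(segments):
    # Model of segmented-BLOB framing: each segment is prefixed
    # with a two-byte length indicator (little-endian assumed here).
    out = bytearray()
    for seg in segments:
        out += struct.pack('<H', len(seg)) + seg
    return bytes(out)

def unpack_segments(data):
    # Split framed data back into the original segments.
    segments, pos = [], 0
    while pos < len(data):
        (length,) = struct.unpack_from('<H', data, pos)
        pos += 2
        segments.append(data[pos:pos + length])
        pos += length
    return segments

framed = pack_segments([b'hello ', b'world'])
assert unpack_segments(framed) == [b'hello ', b'world']
```

A stream BLOB, by contrast, would correspond to the plain concatenation `b'hello world'` with no length indicators, which is what makes byte-offset seeking possible.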

IDB implements stream BLOBs as file-like objects. On input, you can simply pass
any file-like object (only a 'read' method is required) as the parameter value for
a BLOB column. For example:

.. code-block:: python

   f = open('filename.ext', 'rb')
   cur.execute('insert into T (MyBLOB) values (?)', [f])
   f.close()
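Since only a 'read' method is required, any duck-typed object should work as input, not just real files. The class below is a hypothetical illustration of the minimal contract such an object has to satisfy:

```python
class ByteSource:
    # Minimal file-like object: exposes only the read() method
    # that a BLOB input parameter needs (hypothetical example class).
    def __init__(self, data):
        self._data = data
        self._pos = 0

    def read(self, size=-1):
        # Mimic file semantics: size=-1 reads everything remaining.
        if size < 0:
            size = len(self._data) - self._pos
        chunk = self._data[self._pos:self._pos + size]
        self._pos += size
        return chunk

src = ByteSource(b'BLOB payload')
assert src.read(4) == b'BLOB'
assert src.read() == b' payload'
```

Under this reading of the API, such an object could be passed in place of `f` in the INSERT example above; the table and column names there remain placeholders.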

On output, stream BLOBs are represented by BlobReader instances on request. To
request streamed access to a BLOB, you have to use a prepared statement for your
query and call its `set_stream_blob(column_name)` method. Stream access is not
allowed for cursors, because cursors cache prepared statements internally, which
would lead to dangerous situations (BlobReader life-time management) and anomalies
(stream access when it's not required). Example:

.. code-block:: python

   p = cur.prep('select first 1 MyBLOB from T')
   p.set_stream_blob('MyBLOB')
   cur.execute(p)
   row = cur.fetchone()
   blob_reader = row[0]
   print(blob_reader.readlines())
   blob_reader.close()

Whenever you use stream access to a BLOB, IDB opens or creates the underlying BLOB
value as a stream one. On input this means that a true stream BLOB is created in
the database, but on output it depends on how the BLOB value was actually created.
If the BLOB was created as a stream one, you can use the seek method of BlobReader,
but if it was created as a regular BLOB, any call to seek will raise an error::

   SQLCODE: -685
   - invalid ARRAY or BLOB operation
   - invalid BLOB type for operation

You can read BLOBs created as stream ones as fully materialized values, and regular
ones in stream mode (without seek), without any problems, and the same applies to
input - you can create values in the same column as stream or regular ones
interchangeably. From your point of view, stream BLOBs are just a different
interface to BLOB values, with a single exception: `BlobReader.seek()` will throw
an exception if you call it on a BLOB value that was not created as a stream BLOB.

To work with stream BLOBs, you don't need the `cursor.set_type_trans_in/out`
methods used in KDB, i.e. calls like:

.. code-block:: python

   cur.set_type_trans_in ({'BLOB': {'mode': 'stream'}})
   cur.set_type_trans_out({'BLOB': {'mode': 'stream'}})

To write (create) a stream BLOB value, simply pass a file-like object as a parameter
to your INSERT/UPDATE statement where the BLOB value is expected. To read a BLOB
value as a stream, use a prepared statement and register your interest in getting a
BlobReader instead of a fully materialized value via a set_stream_blob() call for
each BLOB value (column name) you want to get this way.

:class:`~idb.BlobReader` supports the iteration protocol, and the read(), readline(),
readlines(), seek(), tell(), flush() (as a no-op) and close() methods. It does NOT
support the chunks() method of KInterbasDB.BlobReader.
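That method list is essentially the standard Python file-like protocol, so Python's in-memory streams can serve as a stand-in for experimenting with the calls before touching a database. The snippet below uses `io.BytesIO` purely as a stand-in, not the IDB class itself:

```python
import io

# io.BytesIO supports the same surface described for BlobReader:
# iteration, read/readline/readlines, seek/tell, and close.
reader = io.BytesIO(b'line one\nline two\n')

assert reader.readline() == b'line one\n'
assert reader.tell() == 9          # position after the first line
reader.seek(0)                     # only stream BLOBs allow this in IDB
assert reader.readlines() == [b'line one\n', b'line two\n']
reader.close()
```

The one behavioral gap to keep in mind is the seek restriction: on a real BlobReader backed by a regular (segmented) BLOB, the `seek(0)` call above would raise the SQLCODE -685 error shown earlier.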

It is not strictly necessary to close BlobReader instances explicitly. A
BlobReader object will be automatically closed by its __del__ method when it
goes out of scope, or when its Connection or PreparedStatement closes,
whichever comes first. However, it is always better to close resources
explicitly (via try...finally) than to rely on artifacts of the Python
implementation. You will also encounter errors if the BLOB value was deleted
from the database before the BlobReader was closed, and the odds of that
happening are higher if you do not close it explicitly.
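One idiomatic way to get the explicit close without writing try...finally by hand is `contextlib.closing`, which works on any object exposing a close() method. Shown here with an in-memory stand-in rather than a real BlobReader:

```python
import io
from contextlib import closing

# closing() guarantees .close() runs on exit, even if an exception
# is raised - the same guarantee a manual try...finally would give.
with closing(io.BytesIO(b'blob data')) as reader:
    data = reader.read()

assert data == b'blob data'
assert reader.closed
```

Assuming a real BlobReader exposes close() as documented above, the same pattern should apply to it unchanged.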

Services API
============

Support for InterBase Services was :ref:`completely reworked <working_with_services>` in IDB.

0 commit comments
