Release v0.6.1 (#254)
# Release notes

- Add docker profiles (#246)
- Add Context guide (#244)
- Rework basic guide (#209)
- Change documentation build configuration (#250, #249)
- Various documentation updates (#245, #244, #246)
RLKRo authored Oct 17, 2023
1 parent d97a569 commit 3e37ae3
Showing 18 changed files with 678 additions and 287 deletions.
3 changes: 1 addition & 2 deletions .github/workflows/build_and_publish_docs.yml
@@ -9,7 +9,6 @@ on:
pull_request:
branches:
- dev
- master
workflow_dispatch:

concurrency:
@@ -30,7 +29,7 @@ jobs:

- name: Build images
run: |
docker-compose up -d
make docker_up
- uses: r-lib/actions/setup-pandoc@v2
with:
1 change: 0 additions & 1 deletion .github/workflows/codestyle.yml
@@ -9,7 +9,6 @@ on:
pull_request:
branches:
- dev
- master
workflow_dispatch:

concurrency:
3 changes: 1 addition & 2 deletions .github/workflows/test_coverage.yml
@@ -9,7 +9,6 @@ on:
pull_request:
branches:
- dev
- master
workflow_dispatch:

concurrency:
@@ -28,7 +27,7 @@ jobs:

- name: Build images
run: |
docker-compose up -d
make docker_up
- name: set up python 3.10
uses: actions/setup-python@v4
5 changes: 2 additions & 3 deletions .github/workflows/test_full.yml
@@ -9,7 +9,6 @@ on:
pull_request:
branches:
- dev
- master
workflow_dispatch:

concurrency:
@@ -31,7 +30,7 @@ jobs:
- name: Build images
if: matrix.os == 'ubuntu-latest'
run: |
docker-compose up -d
make docker_up
- name: set up python ${{ matrix.python-version }}
uses: actions/setup-python@v4
@@ -65,7 +64,7 @@ jobs:

- name: Build images
run: |
docker-compose up -d
make docker_up
- name: set up python 3.8
uses: actions/setup-python@v4
60 changes: 50 additions & 10 deletions CONTRIBUTING.md
@@ -67,6 +67,26 @@ WARNING! Because of the current patching solution, `make doc` modifies some of t
so it is strongly advised to use it carefully and in a virtual environment only.
However, this behavior is likely to be changed in the future.

#### Documentation links

In your tutorials, you can use special expanding directives in markdown cells.
They can help shorten the comments and avoid boilerplate code.
The documentation links generated by the directives are always relative
to the local documentation and verified during build.

- `%pip install {args}`
This directive generates a dependency installation cell, adds a comment, and sets the "quiet" flag.

It should be used in tutorials, like this: `# %pip install dff[...]`.
- `%doclink({args})`
This directive generates a documentation link. It supports two or three arguments; the generated link will look like `ARG1/ARG2#ARG3`.

The first argument can be `api` for the DFF codebase, `tutorial` for tutorials, or `guide` for user guides.
- `%mddoclink({args})`
This directive is a shortcut for `%doclink` that generates a markdown format link instead.

The generated link will be either `[ARG2](%doclink(ARG1,ARG2))` or `[ARG3](%doclink(ARG1,ARG2,ARG3))`.
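To make the `%doclink` expansion concrete, here is a hypothetical Python sketch of how an `ARG1/ARG2#ARG3` link could be assembled from the directive's arguments. The function name and argument handling are illustrative, not the actual documentation build code:

```python
def expand_doclink(args: str) -> str:
    """Illustrative expansion of a %doclink(...) directive into a relative link."""
    parts = [part.strip() for part in args.split(",")]
    if len(parts) not in (2, 3):
        raise ValueError("%doclink takes two or three arguments")
    link = f"{parts[0]}/{parts[1]}"  # ARG1/ARG2
    if len(parts) == 3:
        link += f"#{parts[2]}"  # optional #ARG3 anchor
    return link
```

Under this sketch, a two-argument directive yields a bare page link, while the third argument becomes the fragment anchor.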

### Style
To maintain a consistent style we use `black`, a PEP 8 compliant opinionated formatter.
It doesn't take previous formatting into account. See more about [black](https://github.com/psf/black).
@@ -100,16 +120,36 @@ make format
Tests are configured via [`.env_file`](.env_file).

### Docker
For integration tests, DFF uses Docker images of supported databases as well as docker-compose configuration.
The following images are required for complete integration testing:
1. `mysql`
2. `postgres`
3. `redis`
4. `mongo`
5. `cr.yandex/yc/yandex-docker-local-ydb`

All of them will be downloaded, launched and awaited upon running integration test make command (`make test_all`).
However, they can be downloaded separately with `make docker_up` and awaited with `make wait_db` commands.
DFF uses docker images for two purposes:
1. Database images for integration testing.
2. Images for statistics collection.

The first group can be launched via

```bash
docker-compose --profile context_storage up
```

This will download and run all the databases (`mysql`, `postgres`, `redis`, `mongo`, `ydb`).

The second group can be launched via

```bash
docker-compose --profile stats up
```

This will download and launch the Superset dashboard, ClickHouse, and the OpenTelemetry Collector.

To launch both groups, run
```bash
docker-compose --profile context_storage --profile stats up
```
or
```bash
make docker_up
```

This will be done automatically when running `make test_all`.
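For reference, profile-gated services in a compose file look roughly like this. The service names and images below are assumptions for illustration; the repository's actual `docker-compose.yml` may differ:

```yaml
# Hypothetical excerpt: services tagged with profiles start only when
# a matching --profile flag is passed to docker-compose.
services:
  postgres:
    image: postgres
    profiles: ["context_storage"]
  clickhouse:
    image: clickhouse/clickhouse-server
    profiles: ["stats"]
```

Services without a `profiles` key would start with every invocation, which is why the databases and the stats stack can be brought up independently.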

### Other provided features
You can get more info about `make` commands by `help`:
2 changes: 1 addition & 1 deletion dff/__init__.py
@@ -4,7 +4,7 @@

__author__ = "Denis Kuznetsov"
__email__ = "[email protected]"
__version__ = "0.6.0"
__version__ = "0.6.1"


import nest_asyncio
97 changes: 54 additions & 43 deletions dff/script/core/context.py
@@ -32,7 +32,7 @@

def get_last_index(dictionary: dict) -> int:
"""
Obtaining the last index from the `dictionary`. Functions returns `-1` if the `dict` is empty.
Obtain the last index from the `dictionary`. Return `-1` if the `dict` is empty.
:param dictionary: Dictionary with unsorted keys.
:return: Last index from the `dictionary`.
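Only the docstring of this helper appears in the hunk, but based on that docstring its body presumably looks something like the following sketch (an assumption, not the actual source):

```python
def get_last_index(dictionary: dict) -> int:
    """Return the largest key of `dictionary`, or -1 if it is empty."""
    # Keys may arrive unsorted after deserialization, so take the max key
    # rather than relying on insertion order.
    indices = list(dictionary.keys())
    return max(indices) if indices else -1
```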
@@ -44,6 +44,9 @@ def get_last_index(dictionary: dict) -> int:
class Context(BaseModel):
"""
A structure that is used to store data about the context of a dialog.
Avoid storing unserializable data in the fields of this class in order for
context storages to work.
"""

id: Union[UUID, int, str] = Field(default_factory=uuid4)
@@ -77,26 +80,28 @@ class Context(BaseModel):
`misc` stores any custom data. The scripting doesn't use this dictionary by default,
so storing arbitrary data here does not affect the internal Dialog Flow Scripting functions.
Avoid storing unserializable data in order for context storages to work.
- key - Arbitrary data name.
- value - Arbitrary data.
"""
validation: bool = False
"""
`validation` is a flag that signals that :py:class:`~dff.script.Pipeline`,
while being initialized, checks the :py:class:`~dff.script.Script`.
`validation` is a flag that signals that :py:class:`~dff.pipeline.pipeline.pipeline.Pipeline`,
while being initialized, checks the :py:class:`~dff.script.core.script.Script`.
Functions that may yield invalid data during validation
must use this flag to take the validation mode into account.
Otherwise, validation will fail.
"""
framework_states: Dict[ModuleName, Dict[str, Any]] = {}
"""
`framework_states` is used for addons states or for
:py:class:`~dff.script.Pipeline`'s states.
:py:class:`~dff.script.Pipeline`
:py:class:`~dff.pipeline.pipeline.pipeline.Pipeline`'s states.
:py:class:`~dff.pipeline.pipeline.pipeline.Pipeline`
records all its intermediate conditions into the `framework_states`.
After :py:class:`~dff.script.Context` processing is finished,
:py:class:`~dff.script.Pipeline` resets `framework_states` and
returns :py:class:`~dff.script.Context`.
After :py:class:`~.Context` processing is finished,
:py:class:`~dff.pipeline.pipeline.pipeline.Pipeline` resets `framework_states` and
returns :py:class:`~.Context`.
- key - Temporary variable name.
- value - Temporary variable data.
@@ -106,7 +111,7 @@ class Context(BaseModel):
@classmethod
def sort_dict_keys(cls, dictionary: dict) -> dict:
"""
Sorting the keys in the `dictionary`. This needs to be done after deserialization,
Sort the keys in the `dictionary`. This needs to be done after deserialization,
since the keys are deserialized in a random order.
:param dictionary: Dictionary with unsorted keys.
@@ -117,16 +122,15 @@ def sort_dict_keys(cls, dictionary: dict) -> dict:
@classmethod
def cast(cls, ctx: Optional[Union["Context", dict, str]] = None, *args, **kwargs) -> "Context":
"""
Transforms different data types to the objects of
:py:class:`~dff.script.Context` class.
Returns an object of :py:class:`~dff.script.Context`
Transform different data types to the objects of the
:py:class:`~.Context` class.
Return an object of the :py:class:`~.Context`
type that is initialized by the input data.
:param ctx: Different data types, that are used to initialize object of
:py:class:`~dff.script.Context` type.
The empty object of :py:class:`~dff.script.Context`
type is created if no data are given.
:return: Object of :py:class:`~dff.script.Context`
:param ctx: Data that is used to initialize an object of the
:py:class:`~.Context` type.
An empty :py:class:`~.Context` object is returned if no data is given.
:return: Object of the :py:class:`~.Context`
type that is initialized by the input data.
"""
if not ctx:
@@ -137,14 +141,15 @@ def cast(cls, ctx: Optional[Union["Context", dict, str]] = None, *args, **kwargs
ctx = Context.model_validate_json(ctx)
elif not issubclass(type(ctx), Context):
raise ValueError(
f"context expected as sub class of Context class or object of dict/str(json) type, but got {ctx}"
f"Context expected to be an instance of the Context class "
f"or an instance of the dict/str(json) type. Got: {type(ctx)}"
)
return ctx
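The dispatch in `cast` can be illustrated with a self-contained sketch, using a simplified stand-in class rather than the real pydantic-based `Context`:

```python
import json
from dataclasses import dataclass, field


@dataclass
class SimpleContext:
    # Simplified stand-in for Context; the real class is a pydantic model.
    id: str = "default"
    misc: dict = field(default_factory=dict)


def cast(ctx=None) -> "SimpleContext":
    """Mirror Context.cast's dispatch: accept None, dict, JSON string, or an instance."""
    if not ctx:
        return SimpleContext()
    if isinstance(ctx, SimpleContext):
        return ctx
    if isinstance(ctx, dict):
        return SimpleContext(**ctx)
    if isinstance(ctx, str):
        return SimpleContext(**json.loads(ctx))
    raise ValueError(
        f"Context expected to be an instance of the Context class "
        f"or an instance of the dict/str(json) type. Got: {type(ctx)}"
    )
```

The real implementation validates with `Context.model_validate` / `Context.model_validate_json`; the sketch only shows the type-based branching.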

def add_request(self, request: Message):
"""
Adds to the context the next `request` corresponding to the next turn.
The addition takes place in the `requests` and `new_index = last_index + 1`.
Add a new `request` to the context.
The new `request` is added with the index of `last_index + 1`.
:param request: `request` to be added to the context.
"""
@@ -154,8 +159,8 @@ def add_request(self, request: Message):

def add_response(self, response: Message):
"""
Adds to the context the next `response` corresponding to the next turn.
The addition takes place in the `responses`, and `new_index = last_index + 1`.
Add a new `response` to the context.
The new `response` is added with the index of `last_index + 1`.
:param response: `response` to be added to the context.
"""
@@ -165,9 +170,8 @@ def add_response(self, response: Message):

def add_label(self, label: NodeLabel2Type):
"""
Adds to the context the next :py:const:`label <dff.script.NodeLabel2Type>`,
corresponding to the next turn.
The addition takes place in the `labels`, and `new_index = last_index + 1`.
Add a new :py:data:`~.NodeLabel2Type` to the context.
The new `label` is added with the index of `last_index + 1`.
:param label: `label` that we need to add to the context.
"""
@@ -180,12 +184,12 @@ def clear(
field_names: Union[Set[str], List[str]] = {"requests", "responses", "labels"},
):
"""
Deletes all recordings from the `requests`/`responses`/`labels` except for
Delete all records from the `requests`/`responses`/`labels` except for
the last `hold_last_n_indices` turns.
If `field_names` contains `misc` field, `misc` field is fully cleared.
:param hold_last_n_indices: Number of last turns that remain under clearing.
:param field_names: Properties of :py:class:`~dff.script.Context` we need to clear.
:param hold_last_n_indices: Number of last turns to keep.
:param field_names: Properties of :py:class:`~.Context` to clear.
Defaults to {"requests", "responses", "labels"}
"""
field_names = field_names if isinstance(field_names, set) else set(field_names)
@@ -206,26 +210,29 @@ def clear(
@property
def last_label(self) -> Optional[NodeLabel2Type]:
"""
Returns the last :py:const:`~dff.script.NodeLabel2Type` of
the :py:class:`~dff.script.Context`.
Returns `None` if `labels` is empty.
Return the last :py:data:`~.NodeLabel2Type` of
the :py:class:`~.Context`.
Return `None` if `labels` is empty.
Since `start_label` is not added to the `labels` field,
empty `labels` usually indicates that the current node is the `start_node`.
"""
last_index = get_last_index(self.labels)
return self.labels.get(last_index)

@property
def last_response(self) -> Optional[Message]:
"""
Returns the last `response` of the current :py:class:`~dff.script.Context`.
Returns `None` if `responses` is empty.
Return the last `response` of the current :py:class:`~.Context`.
Return `None` if `responses` is empty.
"""
last_index = get_last_index(self.responses)
return self.responses.get(last_index)

@last_response.setter
def last_response(self, response: Optional[Message]):
"""
Sets the last `response` of the current :py:class:`~dff.core.engine.core.context.Context`.
Set the last `response` of the current :py:class:`~.Context`.
Required for use with various response wrappers.
"""
last_index = get_last_index(self.responses)
@@ -234,16 +241,16 @@ def last_response(self, response: Optional[Message]):
@property
def last_request(self) -> Optional[Message]:
"""
Returns the last `request` of the current :py:class:`~dff.script.Context`.
Returns `None` if `requests` is empty.
Return the last `request` of the current :py:class:`~.Context`.
Return `None` if `requests` is empty.
"""
last_index = get_last_index(self.requests)
return self.requests.get(last_index)

@last_request.setter
def last_request(self, request: Optional[Message]):
"""
Sets the last `request` of the current :py:class:`~dff.core.engine.core.context.Context`.
Set the last `request` of the current :py:class:`~.Context`.
Required for use with various request wrappers.
"""
last_index = get_last_index(self.requests)
@@ -252,7 +259,7 @@ def last_request(self, request: Optional[Message]):
@property
def current_node(self) -> Optional[Node]:
"""
Returns current :py:class:`~dff.script.Node`.
Return current :py:class:`~dff.script.core.script.Node`.
"""
actor = self.framework_states.get("actor", {})
node = (
@@ -264,25 +271,29 @@
)
if node is None:
logger.warning(
"The `current_node` exists when an actor is running between `ActorStage.GET_PREVIOUS_NODE`"
" and `ActorStage.FINISH_TURN`"
"The `current_node` method should be called "
"when an actor is running between the "
"`ActorStage.GET_PREVIOUS_NODE` and `ActorStage.FINISH_TURN` stages."
)

return node

def overwrite_current_node_in_processing(self, processed_node: Node):
"""
Overwrites the current node with a processed node. This method only works in processing functions.
Set the current node to be `processed_node`.
This method only works in processing functions (pre-response and pre-transition).
The actual current node is not changed.
:param processed_node: `node` that we need to overwrite current node.
:param processed_node: `node` to set as the current node.
"""
is_processing = self.framework_states.get("actor", {}).get("processed_node")
if is_processing:
self.framework_states["actor"]["processed_node"] = Node.model_validate(processed_node)
else:
logger.warning(
f"The `{self.overwrite_current_node_in_processing.__name__}` "
"function can only be run during processing functions."
"method can only be called from processing functions (either pre-response or pre-transition)."
)

