Commit 0c7cf5b: [master < T0068-GA] Update README (#136)

* Update README * Fix broken link * Move emojis * Add table to title

1 parent 3209170

1 file changed: README.md (163 additions, 54 deletions)
@@ -40,85 +40,194 @@ The project uses [Poetry](https://python-poetry.org/) to build the GQLAlchemy Py
Before starting the tests, make sure you have an active Memgraph instance running. Execute the following command:

`poetry run pytest .`

## GQLAlchemy capabilities
<details>
<summary>🗺️ Object graph mapper</summary>
<br>

Below is an example of how to create `User` and `Language` node classes and a relationship class of type `SPEAKS`. It also shows how to create a new node and relationship, save them to the database, and then load them back.
<br>
<br>
```python
from typing import Optional

from gqlalchemy import Memgraph, Node, Relationship, Field

db = Memgraph()


class User(Node, index=True, db=db):
    id: str = Field(index=True, exist=True, unique=True, db=db)
    username: Optional[str] = None


class Language(Node):
    name: str = Field(unique=True, db=db)


class Speaks(Relationship, type="SPEAKS"):
    pass


user = User(id="3", username="John").save(db)
language = Language(name="en").save(db)
speaks_rel = Speaks(
    _start_node_id=user._id,
    _end_node_id=language._id,
).save(db)

loaded_user = User(id="3").load(db=db)
print(loaded_user)
loaded_speaks = Speaks(
    _start_node_id=user._id,
    _end_node_id=language._id,
).load(db)
print(loaded_speaks)
```
</details>
<details>
<summary>🔨 Query builder</summary>
<br>

When building a Cypher query, you can use a set of methods that are wrappers around Cypher clauses.
<br>
<br>

```python
from gqlalchemy import create, match

query_create = (
    create()
    .node(labels="Person", name="Leslie")
    .to(edge_label="FRIENDS_WITH")
    .node(labels="Person", name="Ron")
    .execute()
)

query_match = (
    match()
    .node(labels="Person", variable="p1")
    .to()
    .node(labels="Person", variable="p2")
    .where(item="p1.name", operator="=", literal="Leslie")
    .return_({"p1": "p1"})
    .execute()
)
```
</details>
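To see what such chained calls assemble under the hood, here is a deliberately simplified, hypothetical sketch of the fluent-builder pattern (not GQLAlchemy's actual implementation) that builds a Cypher `MATCH` string:

```python
# Hypothetical mini builder: each method appends a Cypher fragment and
# returns self, so calls can be chained just like in the example above.
class MiniQueryBuilder:
    def __init__(self):
        self._parts = []

    def node(self, labels="", variable=""):
        label = f":{labels}" if labels else ""
        self._parts.append(f"({variable}{label})")
        return self

    def to(self, edge_label=""):
        label = f":{edge_label}" if edge_label else ""
        self._parts.append(f"-[{label}]->")
        return self

    def where(self, item, operator, literal):
        self._parts.append(f" WHERE {item} {operator} {literal!r}")
        return self

    def return_(self, results):
        self._parts.append(" RETURN " + ", ".join(results))
        return self

    def build(self):
        return "MATCH " + "".join(self._parts)


query = (
    MiniQueryBuilder()
    .node(labels="Person", variable="p1")
    .to()
    .node(labels="Person", variable="p2")
    .where(item="p1.name", operator="=", literal="Leslie")
    .return_(["p1"])
    .build()
)
print(query)
```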

<details>
<summary>🚰 Manage streams</summary>
<br>

You can create and start a Kafka or Pulsar stream using GQLAlchemy.
<br>

**Kafka stream**
```python
from gqlalchemy import Memgraph, MemgraphKafkaStream

db = Memgraph()

stream = MemgraphKafkaStream(name="ratings_stream", topics=["ratings"], transform="movielens.rating", bootstrap_servers="localhost:9093")
db.create_stream(stream)
db.start_stream(stream)
```

**Pulsar stream**
```python
from gqlalchemy import Memgraph, MemgraphPulsarStream

db = Memgraph()

stream = MemgraphPulsarStream(name="ratings_stream", topics=["ratings"], transform="movielens.rating", service_url="localhost:6650")
db.create_stream(stream)
db.start_stream(stream)
```
</details>
<details>
<summary>🗄️ Import table data from different sources</summary>
<br>

**Import table data to a graph database**

You can translate table data from a file into graph data and import it into Memgraph. Currently, the CSV, Parquet, ORC and IPC/Feather/Arrow file formats are supported via the PyArrow package.

Read all about it in the [table to graph importer how-to guide](https://memgraph.com/docs/gqlalchemy/how-to-guides/loaders/table-to-graph-importer).
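To make the idea concrete, here is a small, hypothetical sketch using only the standard library (not the PyArrow-based loaders) of what translating table rows into graph data means: each row becomes a Cypher `CREATE` statement.

```python
import csv
import io

# Hypothetical sketch: translate table rows into Cypher CREATE statements.
# The real GQLAlchemy loaders read the files via PyArrow and send the data
# to Memgraph directly; this only illustrates the row-to-node mapping.
raw = "name,age\nLeslie,40\nRon,42\n"

statements = [
    "CREATE (:Person {{name: '{name}', age: {age}}})".format(**row)
    for row in csv.DictReader(io.StringIO(raw))
]
for statement in statements:
    print(statement)
```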


**Make a custom file system importer**

If you want to read from a file system not currently supported by GQLAlchemy, or use a file type that is currently not readable, you can implement your own importer by extending the abstract classes `FileSystemHandler` and `DataLoader`, respectively.

Read all about it in the [custom file system importer how-to guide](https://memgraph.com/docs/gqlalchemy/how-to-guides/loaders/custom-file-system-importer).

</details>
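The extension pattern can be sketched roughly as follows. The class names mirror the abstract classes mentioned above, but the method signatures here are illustrative assumptions, not GQLAlchemy's actual interface:

```python
import os
import tempfile
from abc import ABC, abstractmethod

# Illustrative sketch of the extension pattern; the signatures are
# assumptions, not GQLAlchemy's actual FileSystemHandler/DataLoader API.

class FileSystemHandler(ABC):
    @abstractmethod
    def open(self, path: str):
        """Return a readable file-like object for the given path."""

class DataLoader(ABC):
    @abstractmethod
    def load_rows(self, path: str):
        """Yield one dict per row of the source file."""

class LocalFileSystemHandler(FileSystemHandler):
    def open(self, path: str):
        return open(path, "r", encoding="utf-8")

class TSVLoader(DataLoader):
    """Reads tab-separated files through any FileSystemHandler."""

    def __init__(self, fs: FileSystemHandler):
        self.fs = fs

    def load_rows(self, path: str):
        with self.fs.open(path) as f:
            header = f.readline().rstrip("\n").split("\t")
            for line in f:
                yield dict(zip(header, line.rstrip("\n").split("\t")))

# Tiny usage example with a temporary TSV file.
fd, path = tempfile.mkstemp(suffix=".tsv")
with os.fdopen(fd, "w") as f:
    f.write("id\tname\n1\tLeslie\n")

rows = list(TSVLoader(LocalFileSystemHandler()).load_rows(path))
os.remove(path)
print(rows)
```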

<details>
<summary>⚙️ Manage Memgraph instances</summary>
<br>

You can start, stop, connect to and monitor Memgraph instances with GQLAlchemy.

**Manage Memgraph Docker instance**

```python
from gqlalchemy.instance_runner import (
    DockerImage,
    MemgraphInstanceDocker,
)

memgraph_instance = MemgraphInstanceDocker(
    docker_image=DockerImage.MEMGRAPH, docker_image_tag="latest", host="0.0.0.0", port=7687
)
memgraph = memgraph_instance.start_and_connect(restart=False)

print(list(memgraph.execute_and_fetch("RETURN 'Memgraph is running' AS result"))[0]["result"])
```

**Manage Memgraph binary instance**

```python
from gqlalchemy.instance_runner import MemgraphInstanceBinary

memgraph_instance = MemgraphInstanceBinary(
    host="0.0.0.0", port=7698, binary_path="/usr/lib/memgraph/memgraph", user="memgraph"
)
memgraph = memgraph_instance.start_and_connect(restart=False)

print(list(memgraph.execute_and_fetch("RETURN 'Memgraph is running' AS result"))[0]["result"])
```
</details>

<details>
<summary>🔫 Manage database triggers</summary>
<br>

Because Memgraph supports database triggers on `CREATE`, `UPDATE` and `DELETE` operations, GQLAlchemy also implements a simple interface for maintaining these triggers.

```python
from gqlalchemy import Memgraph, MemgraphTrigger
from gqlalchemy.models import (
    TriggerEventType,
    TriggerEventObject,
    TriggerExecutionPhase,
)

db = Memgraph()

trigger = MemgraphTrigger(
    name="ratings_trigger",
    event_type=TriggerEventType.CREATE,
    event_object=TriggerEventObject.NODE,
    execution_phase=TriggerExecutionPhase.AFTER,
    statement="UNWIND createdVertices AS node SET node.created_at = LocalDateTime()",
)

db.create_trigger(trigger)
triggers = db.get_triggers()
print(triggers)
```
</details>

<details>
<summary>💽 On-disk storage</summary>
<br>

Since Memgraph is an in-memory graph database, the GQLAlchemy library provides an on-disk storage solution for large properties that are not used in graph algorithms. This is useful when nodes or relationships carry metadata that doesn't need to take part in any graph algorithm run in Memgraph but should still be fetchable afterwards. Learn all about it in the [on-disk storage how-to guide](https://memgraph.com/docs/gqlalchemy/how-to-guides/on-disk-storage).

</details>
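As a rough, hypothetical illustration of that pattern (not the GQLAlchemy API), a large property can be kept in an on-disk SQLite table keyed by node id, while the graph stores only the lightweight fields:

```python
import sqlite3

# Hypothetical sketch of the on-disk property pattern, not GQLAlchemy's API:
# big values live in SQLite keyed by node id; the graph keeps only small
# fields used by graph algorithms.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE node_properties (node_id INTEGER, name TEXT, value TEXT)"
)

def save_property(node_id: int, name: str, value: str) -> None:
    conn.execute(
        "INSERT INTO node_properties VALUES (?, ?, ?)", (node_id, name, value)
    )

def load_property(node_id: int, name: str):
    row = conn.execute(
        "SELECT value FROM node_properties WHERE node_id = ? AND name = ?",
        (node_id, name),
    ).fetchone()
    return row[0] if row else None

save_property(1, "description", "a very large blob of metadata " * 100)
print(len(load_property(1, "description")))
```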
228+
<br>

If you want to learn more about the OGM, query builder, managing streams, importing data from different sources, managing Memgraph instances, managing database triggers and using on-disk storage, check out the GQLAlchemy [how-to guides](https://memgraph.com/docs/gqlalchemy/how-to-guides).

## Development (how to build)