
pymlir's People

Contributors

amanda849, berke-ates, blaine-fs, joker-eph, kaushikcfd, ro-i, tbennun

pymlir's Issues

Installation procedure

Can I get the steps to install the Python version of MLIR (pymlir)?
What dependencies should I install?
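
For context drawn from other issues in this tracker: pymlir is a pure-Python package whose tracebacks go through the lark and parse libraries (its main dependencies), and it can be installed either from PyPI or from a clone of this repository with pip. Note that a later issue here reports the PyPI release being out of date, so installing from a clone may be more reliable. A minimal sketch to check that an install works, assuming a local toy.mlir file as used in the tutorial:

import mlir

# Parse a small MLIR file into an AST and print it back out.
module = mlir.parse_path('toy.mlir')   # 'toy.mlir' is a placeholder input file
print(module.dump())                   # round-trip the module back to MLIR text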

MLIR changed generic form

This simple mlir:

module  {
    llvm.func @mlir_entry(%a: i32, %b: i32) -> i32 {
        %0 = llvm.add %b, %a  : i32
        llvm.return %0 : i32
    }
}

yields the following generic form after executing mlir-opt --mlir-print-op-generic:

"module"() ( {
  "llvm.func"() ( {
  ^bb0(%arg0: i32, %arg1: i32):  // no predecessors
    %0 = "llvm.add"(%arg1, %arg0) : (i32, i32) -> i32
    "llvm.return"(%0) : (i32) -> ()
  }) {linkage = 10 : i64, sym_name = "mlir_entry", type = !llvm.func<i32 (i32, i32)>} : () -> ()
}) : () -> ()

However, executing the following simple Python script:

import mlir

ast1 = mlir.parse_path('out.mlir')

throws this:

lark.exceptions.UnexpectedCharacters: No terminal matches '"' in the current parser context, at line 1 col 1

"module"() ( {
^
Expected one of: 
        * MODULE
        * FUNC
        * BANG
        * HASH

I am assuming the MLIR team changed the generic form.

Support for cross referencing dialect node rules?

For example, say in the linalg dialect I need to parse:

    func @example(%A: memref<?x?xf64>, %B: memref<?x?xf64>, %C: memref<?x?xf64>) {
      linalg.generic #attrs ins(%A, %B: memref<?x?xf64>, memref<?x?xf64>) outs(%C: memref<?x?xf64>) {
      ^bb0(%a: f64, %b: f64, %c: f64):
        %d = addf %a, %b : f64
        linalg.yield %d : f64
      }
      return
    }

In this case I can parse the arguments to linalg.generic as:

class Ins(DialectOp):
    _syntax_ = "ins( {args.ssa_id_list} : {types.type_list_no_parens} )"

class Outs(DialectOp):
    _syntax_ = "outs( {args.ssa_id_list} : {types.type_list_no_parens} )"

but how do I cross-reference these rules by name while defining the LinalgGeneric class?

Most of the test cases in the project cannot pass

It's as simple as the title says:
FAILED tests/test_builder.py::test_query - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_custom_dialect.py::test_custom_dialect - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_batch_matmul - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_conv - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_copy - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_dot - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_fill - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_generic - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_indexed_generic - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_view - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_matmul - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_linalg.py::test_matvec - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_roundtrip.py::test_toy_roundtrip - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_roundtrip.py::test_loop_dialect_roundtrip - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_syntax.py::test_attributes - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_syntax.py::test_memrefs - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_syntax.py::test_trailing_loc - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_syntax.py::test_functions - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_syntax.py::test_toplevel_function - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_syntax.py::test_toplevel_functions - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_syntax.py::test_affine - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_syntax.py::test_definitions - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_toy.py::test_toy_simple - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_toy.py::test_toy_advanced - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_visitors.py::test_visitor - lark.exceptions.VisitError: Error trying to process rule "float_type":
FAILED tests/test_visitors.py::test_transformer - lark.exceptions.VisitError: Error trying to process rule "float_type":
===================================================== 26 failed, 10 passed, 2 warnings in 15.65s ======================================================

Parse error in onnx mlir

Model:

module  {
  func @main_graph(%arg0: tensor<?x1x5x5xf32>) -> tensor<?x1x3x3xf32> attributes {input_names = ["conv2d_input"], output_names = ["conv2d"]} {
    %cst = constant unit
    %0 = "onnx.Constant"() {value = dense<[[[[-0.184168547, 0.380746067, -0.299985915], [0.182676792, 0.086410582, 0.571638823], [-0.120835036, 0.374882519, 0.118288457]]]]> : tensor<1x1x3x3xf32>} : () -> tensor<1x1x3x3xf32>
    %1 = "onnx.Conv"(%arg0, %0, %cst) {auto_pad = "NOTSET", dilations = [1, 1], group = 1 : si64, kernel_shape = [3, 3], onnx_node_name = "StatefulPartitionedCall/sequential/conv2d/Conv2D", pads = [0, 0, 0, 0], strides = [1, 1]} : (tensor<?x1x5x5xf32>, tensor<1x1x3x3xf32>, none) -> tensor<?x1x3x3xf32>
    return %1 : tensor<?x1x3x3xf32>
  }
  "onnx.EntryPoint"() {func = @main_graph, numInputs = 1 : i32, numOutputs = 1 : i32, signature = "[    { \22type\22 : \22float\22 , \22dims\22 : [-1 , 1 , 5 , 5]  }\0A\0A]\00@[   { \22type\22 : \22float\22 , \22dims\22 : [-1 , 1 , 3 , 3]  }\0A\0A]\00"} : () -> ()
}

Traceback:

Traceback (most recent call last):
  File "gpxResultValidator.py", line 1426, in <module>
    format_output_and_print(*main(model_mlir, input_path), wtm_width)
  File "gpxResultValidator.py", line 1310, in main
    m = mlir.parse_path(model_mlir)
  File "/home/vasantha/.local/lib/python3.8/site-packages/mlir/parser.py", line 159, in parse_path
    return parse_file(fp, dialects)
  File "/home/vasantha/.local/lib/python3.8/site-packages/mlir/parser.py", line 146, in parse_file
    return parse_string(file.read(), dialects)
  File "/home/vasantha/.local/lib/python3.8/site-packages/mlir/parser.py", line 133, in parse_string
    return parser.parse(code)
  File "/home/vasantha/.local/lib/python3.8/site-packages/mlir/parser.py", line 93, in parse
    tree = self.parser.parse(code)
  File "/home/vasantha/.local/lib/python3.8/site-packages/lark/lark.py", line 581, in parse
    return self.parser.parse(text, start=start, on_error=on_error)
  File "/home/vasantha/.local/lib/python3.8/site-packages/lark/parser_frontends.py", line 106, in parse
    return self.parser.parse(stream, chosen_start, **kw)
  File "/home/vasantha/.local/lib/python3.8/site-packages/lark/parsers/earley.py", line 297, in parse
    to_scan = self._parse(lexer, columns, to_scan, start_symbol)
  File "/home/vasantha/.local/lib/python3.8/site-packages/lark/parsers/xearley.py", line 144, in _parse
    to_scan = scan(i, to_scan)
  File "/home/vasantha/.local/lib/python3.8/site-packages/lark/parsers/xearley.py", line 118, in scan
    raise UnexpectedCharacters(stream, i, text_line, text_column, {item.expect.name for item in to_scan},
lark.exceptions.UnexpectedCharacters: No terminal matches '%' in the current parser context, at line 4 col 5

    %0 = "onnx.Constant"() {value = dense<[[
    ^
Expected one of: 
	* __ANON_4
	* __ANON_0
	* DOT
	* __ANON_7
	* __ANON_8

Hi, I'm trying to parse the MLIR file above. I think the line %cst = constant unit is not being parsed properly. I don't think this is an issue with the MLIR file's syntax, as I was able to compile it with onnx-mlir. I saw in issue #19 that this can be bypassed by adding a new dialect. Can you give some pointers on how to fix this in pymlir, or on how to create a dialect for this?

Thank you,
Vasanth.

dump_json()

I'm interested in being able to save the MLIR AST in a JSON format, to make it easier to consume from other tools.

Is this something that might be worthwhile to others?
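
One possible shape for this, sketched under the assumption that the AST nodes are dataclasses (as the dialect examples elsewhere in this tracker suggest); a real dump_json() would presumably live on the node classes themselves:

import json
import dataclasses

import mlir

# Sketch only: recurse over dataclass fields and lists, and fall back to
# str() for anything else (tokens, enums, plain values).
def ast_to_dict(node):
    if dataclasses.is_dataclass(node):
        return {field.name: ast_to_dict(getattr(node, field.name))
                for field in dataclasses.fields(node)}
    if isinstance(node, (list, tuple)):
        return [ast_to_dict(el) for el in node]
    if node is None or isinstance(node, (str, int, float, bool)):
        return node
    return str(node)

module = mlir.parse_path('toy.mlir')   # 'toy.mlir' is a placeholder input file
print(json.dumps(ast_to_dict(module), indent=2))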

Tutorial Error: lark.exceptions.VisitError: Error trying to process rule "float_type"

I installed pymlir via pip3 install ./ in the directory that has setup.py.
When attempting to follow the tutorial, I get the following error:

Python 3.8.10 (default, May 26 2023, 14:05:08) 
[GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import mlir
>>> ast1 = mlir.parse_path('toy.mlir')
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 47, in _call_userfunc
    return f(children)
  File "/usr/local/lib/python3.8/dist-packages/mlir/astnodes.py", line 17, in from_lark
    assert not any(isinstance(el, (Token, Tree)) for el in args)
AssertionError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.8/dist-packages/mlir/parser.py", line 159, in parse_path
    return parse_file(fp, dialects)
  File "/usr/local/lib/python3.8/dist-packages/mlir/parser.py", line 146, in parse_file
    return parse_string(file.read(), dialects)
  File "/usr/local/lib/python3.8/dist-packages/mlir/parser.py", line 133, in parse_string
    return parser.parse(code)
  File "/usr/local/lib/python3.8/dist-packages/mlir/parser.py", line 96, in parse
    root_node = self.transformer.transform(tree)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 84, in transform
    return self._transform_tree(tree)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 80, in _transform_tree
    children = list(self._transform_children(tree.children))
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 71, in _transform_children
    yield self._transform_tree(c)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 81, in _transform_tree
    return self._call_userfunc(tree, children)
  File "/usr/local/lib/python3.8/dist-packages/lark/visitors.py", line 51, in _call_userfunc
    raise VisitError(tree, e)
lark.exceptions.VisitError: Error trying to process rule "float_type":

Platform:

$ lsb_release -a
No LSB modules are available.
Distributor ID:	Ubuntu
Description:	Ubuntu 20.04.6 LTS
Release:	20.04
Codename:	focal

$ python3
Python 3.8.10 (default, May 26 2023, 14:05:08) 
[GCC 9.4.0] on linux
>>> import lark
>>> lark.__version__
'0.7.8'
>>> import parse
>>> parse.__version__
'1.14.0'

Problem with custom dialects

Thanks for this great package!
I was trying to run the dialect example and got the following errors; I am not sure why.
The code is:
import mlir
from typing import Union
from mlir.dialect import Dialect, DialectType
from dataclasses import dataclass
import mlir.astnodes as mast

@dataclass
class RaggedTensorType(DialectType):
    """ AST node class for the example "toy" dialect representing a ragged tensor. """
    implementation: mast.StringLiteral
    dims: list[mast.Dimension]
    type: Union[mast.TensorType, mast.MemRefType]
    syntax = ('toy.ragged < {implementation.string_literal} , {dims.dimension_list_ranked} '
              '{type.tensor_memref_element_type} >')

pphlo_dialect = Dialect("pphlo dialect", types=[RaggedTensorType])

ast3 = mlir.parse_string('''
module {
  func.func @toy_func(%tensor: tensor<2x3xf64>) -> tensor<3x2xf64> {
    %t_tensor = "toy.transpose"(%tensor) { inplace = true } : (tensor<2x3xf64>) -> tensor<3x2xf64>
    return %t_tensor : tensor<3x2xf64>
  }
}
''', dialects=[pphlo_dialect])
print(ast3)

The error message:

Traceback (most recent call last):
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/parsers/lalr_parser.py", line 126, in feed_token
    action, arg = states[state][token.type]
KeyError: 'RULE'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/pyz/2024/mpc_engine/spu/demo/mlir_parse.py", line 141, in <module>
    mlir_parse()
  File "/home/pyz/2024/mpc_engine/spu/demo/mlir_parse.py", line 136, in mlir_parse
    ast = mlir.parse_file(open("./temp.txt", "r"), dialects=[pphlo_dialect])
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/mlir/parser.py", line 146, in parse_file
    return parse_string(file.read(), dialects)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/mlir/parser.py", line 132, in parse_string
    parser = Parser(dialects)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/mlir/parser.py", line 66, in __init__
    self.parser = Lark(parser_src, parser='earley')
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/lark.py", line 300, in __init__
    self.grammar, used_files = load_grammar(grammar, self.source_path, self.options.import_paths, self.options.keep_all_tokens)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/load_grammar.py", line 1352, in load_grammar
    builder.load_grammar(grammar, source)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/load_grammar.py", line 1185, in load_grammar
    tree = _parse_grammar(grammar_text, grammar_name)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/load_grammar.py", line 952, in _parse_grammar
    tree = _get_parser().parse(text + '\n', start)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/parser_frontends.py", line 106, in parse
    return self.parser.parse(stream, chosen_start, **kw)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/parsers/lalr_parser.py", line 41, in parse
    return self.parser.parse(lexer, start)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/parsers/lalr_parser.py", line 171, in parse
    return self.parse_from_state(parser_state)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/parsers/lalr_parser.py", line 188, in parse_from_state
    raise e
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/parsers/lalr_parser.py", line 179, in parse_from_state
    state.feed_token(token)
  File "/home/pyz/installations/anaconda3/envs/spu/lib/python3.10/site-packages/lark/parsers/lalr_parser.py", line 129, in feed_token
    raise UnexpectedToken(token, expected, state=self, interactive_parser=None)
lark.exceptions.UnexpectedToken: Unexpected token Token('RULE', 'dialect_type_secrettensortype_0') at line 453, column 7.
Expected one of:
* _LBRACE
* _COLON
* _DOT

The code is basically a modification of the example. I added the ragged tensor type to see how to use dialect types in parsing. But it won't work even when parsing the original string.

Pip install installs pymlir==0.4, which seems broken, whereas the in-repo setup.py (pymlir==0.3) works

Hi, I just tried pip install pymlir and ran the examples from this repo, specifically:

import mlir
ast3 = mlir.parse_string('''
module {
  func.func @toy_func(%tensor: tensor<2x3xf64>) -> tensor<3x2xf64> {
    %t_tensor = "toy.transpose"(%tensor) { inplace = true } : (tensor<2x3xf64>) -> tensor<3x2xf64>
    return %t_tensor : tensor<3x2xf64>
  }
}
''')
print(ast3)

It throws this lovely stacktrace:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/ed/.pyenv/versions/pymlir/lib/python3.11/site-packages/mlir/parser.py", line 133, in parse_string
    return parser.parse(code)
           ^^^^^^^^^^^^^^^^^^
  File "/Users/ed/.pyenv/versions/pymlir/lib/python3.11/site-packages/mlir/parser.py", line 93, in parse
    tree = self.parser.parse(code)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ed/.pyenv/versions/pymlir/lib/python3.11/site-packages/lark/lark.py", line 311, in parse
    return self.parser.parse(text, start=start)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ed/.pyenv/versions/pymlir/lib/python3.11/site-packages/lark/parser_frontends.py", line 185, in parse
    return self._parse(text, start)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ed/.pyenv/versions/pymlir/lib/python3.11/site-packages/lark/parser_frontends.py", line 54, in _parse
    return self.parser.parse(input, start, *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ed/.pyenv/versions/pymlir/lib/python3.11/site-packages/lark/parsers/earley.py", line 292, in parse
    to_scan = self._parse(stream, columns, to_scan, start_symbol)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ed/.pyenv/versions/pymlir/lib/python3.11/site-packages/lark/parsers/xearley.py", line 137, in _parse
    to_scan = scan(i, to_scan)
              ^^^^^^^^^^^^^^^^
  File "/Users/ed/.pyenv/versions/pymlir/lib/python3.11/site-packages/lark/parsers/xearley.py", line 114, in scan
    raise UnexpectedCharacters(stream, i, text_line, text_column, {item.expect.name for item in to_scan}, set(to_scan))
lark.exceptions.UnexpectedCharacters: No terminal defined for '@' at line 3 col 13

  func.func @toy_func(%tensor: tensor<2x3xf64>) -> t
            ^

Expecting: {'DOT', '__ANON_10', '__ANON_9', 'ESCAPED_STRING', '__ANON_4', 'PERCENT', 'TRUE', '__ANON_1', '__ANON_8', '__ANON_7', 'FALSE', '__ANON_0', 'COLON'}

However, cloning the repo and running the example from within the repo (or doing a pip install .) works as expected. Just wanted to flag this.

Thanks for the work!

Cannot parse `func.func`

Hi,

First of all, thank you for the excellent package.

I think I ran into a corner case in your parser logic. The rule for func does not seem to recognize func.func. This form is used in the most recent versions of MLIR, as func is now an operation of the func dialect. (In fact, I believe bare func no longer works in MLIR.) See this test case:

import mlir

ast3 = mlir.parse_string('''
module {
  func.func @toy_func(%tensor: tensor<2x3xf64>) -> tensor<3x2xf64> {
    %t_tensor = "toy.transpose"(%tensor) { inplace = true } : (tensor<2x3xf64>) -> tensor<3x2xf64>
    return %t_tensor : tensor<3x2xf64>
  }
}
''')

On the latest version I get:

  File "/venv/lib/python3.10/site-packages/mlir/parser.py", line 133, in parse_string
    return parser.parse(code)
  File "/venv/lib/python3.10/site-packages/mlir/parser.py", line 93, in parse
    tree = self.parser.parse(code)
  File "/venv/lib/python3.10/site-packages/lark/lark.py", line 581, in parse
    return self.parser.parse(text, start=start, on_error=on_error)
  File "/venv/lib/python3.10/site-packages/lark/parser_frontends.py", line 106, in parse
    return self.parser.parse(stream, chosen_start, **kw)
  File "/venv/lib/python3.10/site-packages/lark/parsers/earley.py", line 297, in parse
    to_scan = self._parse(lexer, columns, to_scan, start_symbol)
  File "/venv/lib/python3.10/site-packages/lark/parsers/xearley.py", line 144, in _parse
    to_scan = scan(i, to_scan)
  File "/venv/lib/python3.10/site-packages/lark/parsers/xearley.py", line 118, in scan
    raise UnexpectedCharacters(stream, i, text_line, text_column, {item.expect.name for item in to_scan},
lark.exceptions.UnexpectedCharacters: No terminal matches '@' in the current parser context, at line 3 col 13

  func.func @toy_func(%tensor: tensor<2x3xf64>) -> t
            ^
Expected one of: 
        * ESCAPED_STRING
        * __ANON_4
        * PERCENT
        * COLON
        * __ANON_8
        * __ANON_10
        * DOT
        * __ANON_1
        * __ANON_7
        * __ANON_9
        * FALSE
        * __ANON_0
        * TRUE

I think this should be a simple fix. Happy to contribute a PR.
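
Until the grammar learns the func.func spelling, one stopgap (purely a textual hack, not a fix, and it assumes the rest of the module already parses with the bare func spelling) would be to rewrite the op name before parsing:

import re
import mlir

source = open('input.mlir').read()                   # 'input.mlir' is a placeholder path
# Rewrite "func.func" to the bare "func" spelling that the current grammar
# accepts. This is a blunt textual workaround, not a real fix.
patched = re.sub(r'\bfunc\.func\b', 'func', source)
module = mlir.parse_string(patched)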

Help to get started on custom dialect parsing

Hi,

I am trying to use pymlir to parse an MLIR file generated with custom dialects. Is there any additional documentation on how to construct ops / types / attributes for custom dialects using pymlir? Is there any other open-source project that has achieved this which I could use as a reference to get started? Any help is really appreciated. Thanks in advance.
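
There is no dedicated guide yet, but the pattern used by the examples elsewhere in this tracker (the linalg ins/outs snippet and the RaggedTensorType example) looks roughly like the sketch below. The op name, its syntax string, and the ops= keyword are illustrative assumptions rather than a verified recipe; the rule names in braces are taken from the linalg example above.

import mlir
from mlir.dialect import Dialect, DialectOp

# Hypothetical op for a dialect named "mydialect"; {args.ssa_id_list} and
# {types.type_list_no_parens} are the rule references used in the linalg
# ins/outs example earlier in this tracker.
class MyCustomOp(DialectOp):
    _syntax_ = "mydialect.my_op ( {args.ssa_id_list} : {types.type_list_no_parens} )"

# The Dialect constructor with a types= list appears in the RaggedTensorType
# example; an ops= list is assumed by analogy.
my_dialect = Dialect("mydialect", ops=[MyCustomOp])

module = mlir.parse_path('input.mlir', dialects=[my_dialect])  # 'input.mlir' is a placeholder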

Improve parser speed

I am currently using pymlir to parse an MLIR file that uses quite a few custom dialects. A file of ~2200 lines takes about 5 minutes to parse and generate an MLIRFile object when I use the parse_string command. Are there any suggestions or strategies to improve the parser speed?
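
One observation that follows from the tracebacks posted in other issues here: parse_string() constructs a new Parser object, and therefore rebuilds the entire Lark grammar, on every call. If many strings or files are parsed, building the Parser once and reusing it avoids repeating that setup cost; whether this helps for a single large file is an open question. A sketch, assuming a list of custom Dialect objects:

from mlir.parser import Parser

my_dialects = []                       # your custom Dialect objects, if any (placeholder)
mlir_paths = ['a.mlir', 'b.mlir']      # files to parse (placeholder)

# Build the grammar once; Parser takes the list of custom dialects, as seen in
# the parse_string() source referenced by the tracebacks above.
parser = Parser(my_dialects)

for path in mlir_paths:
    with open(path) as f:
        module = parser.parse(f.read())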

Conditional branches in generic form

The following MLIR:

module  {
    func @mlir_entry(%a: i32) -> i32 {
        %1 = constant 1 : i32

        %isOne = cmpi "sle", %a, %1 : i32
        cond_br %isOne, ^one, ^else

        ^one: 
            return %1 : i32

        ^else: 
            return %1 : i32
    }
}

results in the following generic form:

// out.mlir
"module"() ( {
  "func"() ( {
  ^bb0(%arg0: i32):  // no predecessors
    %c1_i32 = "std.constant"() {value = 1 : i32} : () -> i32
    %0 = "std.cmpi"(%arg0, %c1_i32) {predicate = 3 : i64} : (i32, i32) -> i1
    "std.cond_br"(%0)[^bb1, ^bb2] {operand_segment_sizes = dense<[1, 0, 0]> : vector<3xi32>} : (i1) -> ()
  ^bb1:  // pred: ^bb0
    "std.return"(%c1_i32) : (i32) -> ()
  ^bb2:  // pred: ^bb0
    "std.return"(%c1_i32) : (i32) -> ()
  }) {sym_name = "mlir_entry", type = (i32) -> i32} : () -> ()
}) : () -> ()

Executing:

import mlir
ast1 = mlir.parse_path('out.mlir')

throws this exception:

lark.exceptions.UnexpectedCharacters: No terminal matches '[' in the current parser context, at line 6 col 22

    "std.cond_br"(%0)[^bb1, ^bb2] {operand_segment_sizes = de
                     ^
Expected one of: 
        * LBRACE
        * COLON

Failed round-trip for a function with no arguments

Hi,

It seems that pymlir omits the () after a function's name when dumping a function that has no arguments. See this test, which fails on the current master:

from mlir import parse_string

def test_function_no_args():
    """
    Test round-tripping a function with no arguments.
    """
    code = '''module {
  func @toy_func() -> index {
    %0 = constant 0 : index
    return %0 : index
  }
}'''

    module = parse_string(code)
    dump = module.dump()
    assert dump == code

This fails with error:

E AssertionError: assert 'module {\n ...index\n }\n}' == 'module {\n ...index\n }\n}'
E module {
E - func @toy_func() -> index {
E ? --
E + func @toy_func -> index {
E %c0 = constant 0 : index
E return %0 : index
E }...
E
E ...Full output truncated (2 lines hidden), use '-vv' to show

I think this is a one-line fix. Happy to contribute a PR.

Signed and unsigned integers

MLIR allows signed integers like si64 or si32 as well as unsigned ones like ui64 or ui32, as described on their website (MLIR IntegerType). However, this simple example:
hw.mlir:

module  {
    func @mlir_entry(%a: si32) -> si32 {
        return %a : si32
    }
}

main.py:

import mlir
ast1 = mlir.parse_path('hw.mlir')

throws an exception: lark.exceptions.UnexpectedCharacters. The same happens if one replaces si with ui. A quick check shows that these keywords carry over to the generic form as well.

parse .td files to dialects

Hi, I wonder whether it is possible for pymlir to take .td (TableGen) files as input and parse them into Dialects, so that I don't have to hand-write the Python Dialect definitions.
Thanks!

No dialect specification in generated MLIR

The generated MLIR does not include dialect prefixes. For example, tests/test_builder.py generates operations without a dialect (as shown below):

func     -> should be -> func.func
constant -> should be -> arith.constant

This causes issues when I feed the output to LLVM tools (mlir-opt or mlir-translate); they complain about not recognizing "func" or "constant".

module {
  func @saxpy(%_pymlir_fnarg: f64, %_pymlir_fnarg_0: memref<?xf64>, %_pymlir_fnarg_1: memref<?xf64>) {
    %_pymlir_ssa = constant 0 : index
    %_pymlir_ssa_0 = dim %_pymlir_fnarg_0 , %_pymlir_ssa : index
    affine.for %_pymlir_i = 0 to %_pymlir_ssa_0 {
      %_pymlir_ssa_1 = affine.load %_pymlir_fnarg_0 [ (%_pymlir_i) ] : memref<?xf64>
      %_pymlir_ssa_2 = mulf %_pymlir_ssa_1 , %_pymlir_fnarg : f64
      %_pymlir_ssa_3 = affine.load %_pymlir_fnarg_1 [ (%_pymlir_i) ] : memref<?xf64>
      %_pymlir_ssa_4 = addf %_pymlir_ssa_3 , %_pymlir_ssa_2 : f64
      affine.store %_pymlir_ssa_4 , %_pymlir_fnarg_1 [ (%_pymlir_i) ] : memref<?xf64>
    }
    return
  }
}
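
As a stopgap until the builder emits dialect-qualified names, the dumped text could be post-processed. The mapping below covers only some of the ops in the saxpy example above, and the target spellings (func.func, arith.constant, etc.) are the ones current upstream MLIR expects rather than anything pymlir produces; a sketch:

import re

# Partial mapping from the bare op names in the builder output to
# dialect-qualified spellings; this is an assumption about what current
# mlir-opt expects, not a complete solution.
PREFIX_MAP = {
    'func': 'func.func',
    'constant': 'arith.constant',
    'mulf': 'arith.mulf',
    'addf': 'arith.addf',
    'dim': 'memref.dim',
}

def add_dialect_prefixes(text: str) -> str:
    for bare, qualified in PREFIX_MAP.items():
        # Word boundaries avoid rewriting substrings of longer identifiers.
        text = re.sub(r'\b%s\b' % bare, qualified, text)
    return text

# Example usage on a placeholder line standing in for the builder's output:
print(add_dialect_prefixes('%c = constant 0 : index'))
# -> %c = arith.constant 0 : index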
