amanwalia123 commented on Oct 4, 2025

In this PR, I fixed an issue I hit with cross-layer equalization on one of my models. Without this additional check, the call fails with AttributeError: 'NoneType' object has no attribute 'get_module'. We need to verify that op.output_ops[0].model_module is not None before calling get_module() on it. It would be great if someone from the QUIC team could review this.
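As a rough illustration (not my exact model), a call like the following can exercise this code path. The equalize_model entry point and the toy network here are only an assumed sketch for context, not part of this PR:

import torch
from aimet_torch.cross_layer_equalization import equalize_model  # assumed entry point

# Toy network whose last conv has no ReLU consumer; for such a layer the
# consumer op in the connected graph may carry no model_module.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, kernel_size=3),
    torch.nn.ReLU(),
    torch.nn.Conv2d(16, 3, kernel_size=3),
)

# Before the fix, cross-layer equalization could fail here with
# AttributeError: 'NoneType' object has no attribute 'get_module'
equalize_model(model, input_shapes=(1, 3, 224, 224))

The patched helper is below.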

@staticmethod
    def does_module_have_relu_activation(
        connected_graph: ConnectedGraph, module: torch.nn.Module
    ) -> bool:
        """
        Finds whether a given module is followed by a ReLU activation
        :param connected_graph: Reference to ConnectedGraph instance
        :param module: PyTorch module to find activation for
        :return: True if the module is followed by a ReLU or PReLU activation
        """

        for op in connected_graph.get_all_ops().values():
            # Previous condition (raised AttributeError when the consumer op
            # had no model_module):
            # if op.model_module and op.model_module.get_module() is module:
            if (
                op.model_module
                and op.model_module.get_module() is module
                and op.output_ops
                and op.output_ops[0].model_module is not None
            ):
                assert len(op.output_ops) == 1
                is_relu_activation = isinstance(
                    op.output_ops[0].model_module.get_module(),
                    (torch.nn.ReLU, torch.nn.PReLU),
                )
                return is_relu_activation

        return False
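To sanity-check the new guard without building a full ConnectedGraph, here is a small sketch using stand-in objects. The FakeModuleWrapper and SimpleNamespace mocks are hypothetical, and GraphSearchUtils is assumed to be the class hosting this staticmethod:

import torch
from types import SimpleNamespace

from aimet_torch.cross_layer_equalization import GraphSearchUtils  # assumed host class

class FakeModuleWrapper:
    """Hypothetical stand-in for the op.model_module wrapper."""
    def __init__(self, module):
        self._module = module

    def get_module(self):
        return self._module

conv = torch.nn.Conv2d(3, 16, kernel_size=3)

# Producer op whose single consumer has no model_module (e.g. a graph output).
conv_op = SimpleNamespace(
    model_module=FakeModuleWrapper(conv),
    output_ops=[SimpleNamespace(model_module=None)],
)
fake_graph = SimpleNamespace(get_all_ops=lambda: {"conv": conv_op})

# With the added None check this returns False instead of raising AttributeError.
assert GraphSearchUtils.does_module_have_relu_activation(fake_graph, conv) is False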
