Can't load spaCy en_core_web_trf

As the install guide says, I've installed it in a conda environment with:

conda install -c conda-forge spacy
python -m spacy download en_core_web_trf

I have spacy-transformers already installed. But when I simply do:

import spacy
spacy.load("en_core_web_trf")

It shows me this error:

ValueError: [E002] Can't find factory for 'transformer' for language English (en). This usually happens when spaCy calls `nlp.create_pipe` with a custom component name that's not registered on the current language class. If you're using a Transformer, make sure to install 'spacy-transformers'. If you're using a custom component, make sure you've added the decorator `@Language.component` (for function components) or `@Language.factory` (for class components).

Available factories: attribute_ruler, tok2vec, merge_noun_chunks, merge_entities, merge_subtokens, token_splitter, parser, beam_parser, entity_linker, ner, beam_ner, entity_ruler, lemmatizer, tagger, morphologizer, senter, sentencizer, textcat, spancat, textcat_multilabel, en.lemmatizer

More info about the error:

ValueError                                Traceback (most recent call last)
~\AppData\Local\Temp/ipykernel_11108/2648447056.py in <module>
----> 1 nlp_en = spacy.load("en_core_web_trf")

~\Anaconda3\envs\rl\lib\site-packages\spacy\__init__.py in load(name, vocab, disable, exclude, config)
     49     RETURNS (Language): The loaded nlp object.
     50     """
---> 51     return util.load_model(
     52         name, vocab=vocab, disable=disable, exclude=exclude, config=config
     53     )

~\Anaconda3\envs\rl\lib\site-packages\spacy\util.py in load_model(name, vocab, disable, exclude, config)
    345             return get_lang_class(name.replace("blank:", ""))()
    346         if is_package(name):  # installed as package
--> 347             return load_model_from_package(name, **kwargs)
    348         if Path(name).exists():  # path to model data directory
    349             return load_model_from_path(Path(name), **kwargs)

~\Anaconda3\envs\rl\lib\site-packages\spacy\util.py in load_model_from_package(name, vocab, disable, exclude, config)
    378     """
    379     cls = importlib.import_module(name)
--> 380     return cls.load(vocab=vocab, disable=disable, exclude=exclude, config=config)
    381 
    382 

~\Anaconda3\envs\rl\lib\site-packages\en_core_web_trf\__init__.py in load(**overrides)
      8 
      9 def load(**overrides):
---> 10     return load_model_from_init_py(__file__, **overrides)

~\Anaconda3\envs\rl\lib\site-packages\spacy\util.py in load_model_from_init_py(init_file, vocab, disable, exclude, config)
    538     if not model_path.exists():
    539         raise IOError(Errors.E052.format(path=data_path))
--> 540     return load_model_from_path(
    541         data_path,
    542         vocab=vocab,

~\Anaconda3\envs\rl\lib\site-packages\spacy\util.py in load_model_from_path(model_path, meta, vocab, disable, exclude, config)
    413     overrides = dict_to_dot(config)
    414     config = load_config(config_path, overrides=overrides)
--> 415     nlp = load_model_from_config(config, vocab=vocab, disable=disable, exclude=exclude)
    416     return nlp.from_disk(model_path, exclude=exclude, overrides=overrides)
    417 

~\Anaconda3\envs\rl\lib\site-packages\spacy\util.py in load_model_from_config(config, vocab, disable, exclude, auto_fill, validate)
    450     # registry, including custom subclasses provided via entry points
    451     lang_cls = get_lang_class(nlp_config["lang"])
--> 452     nlp = lang_cls.from_config(
    453         config,
    454         vocab=vocab,

~\Anaconda3\envs\rl\lib\site-packages\spacy\language.py in from_config(cls, config, vocab, disable, exclude, meta, auto_fill, validate)
   1712                     # The pipe name (key in the config) here is the unique name
   1713                     # of the component, not necessarily the factory
-> 1714                     nlp.add_pipe(
   1715                         factory,
   1716                         name=pipe_name,

~\Anaconda3\envs\rl\lib\site-packages\spacy\language.py in add_pipe(self, factory_name, name, before, after, first, last, source, config, raw_config, validate)
    774                     lang_code=self.lang,
    775                 )
--> 776             pipe_component = self.create_pipe(
    777                 factory_name,
    778                 name=name,

~\Anaconda3\envs\rl\lib\site-packages\spacy\language.py in create_pipe(self, factory_name, name, config, raw_config, validate)
    639                 lang_code=self.lang,
    640             )
--> 641             raise ValueError(err)
    642         pipe_meta = self.get_factory_meta(factory_name)
    643         # This is unideal, but the alternative would mean you always need to


Solution 1:[1]

Are you sure you installed spacy-transformers, and that you installed it after spaCy?

I am using pip (pip install spacy-transformers) and I have no problem loading en_core_web_trf.
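
For reference, once spaCy, spacy-transformers, and the downloaded model all live in the same environment, loading works as usual. A minimal sketch (the example sentence is arbitrary):

import spacy

nlp = spacy.load("en_core_web_trf")  # succeeds once the "transformer" factory is registered
doc = nlp("Apple is looking at buying U.K. startup for $1 billion.")
print([(ent.text, ent.label_) for ent in doc.ents])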

Solution 2:[2]

# !pip install spacy
# !pip install spacy-transformers
# !python3 -m spacy download en_core_web_trf

This works fine. If you are working in Google Colab, also pin the spaCy version:

# !pip install spacy==3.2.4  # the latest spaCy release at the time of writing

The remaining steps are the same; by default, Colab installs an older spaCy version. Hope this answers your question!
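
To check whether Colab (or any other environment) actually picked up a recent spaCy and can see the downloaded model, something like this works (a minimal sketch):

import spacy

print(spacy.__version__)                  # should be a 3.x release, e.g. 3.2.4
print(spacy.util.get_installed_models())  # "en_core_web_trf" should be listed here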

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Martin Brunecky
Solution 2: Aravind R