When I try to use featuretools[spark] on a pyspark DataFrame, I get an error. My code is below:
import featuretools as ft
import pyspark.pandas as ps
from woodwork.logical_types import Double, Integer

# Use a distributed default index, as recommended for Spark-backed DataFrames
ps.set_option("compute.default_index_type", "distributed")

id = [0, 1, 2, 3, 4]
values = [12, -35, 14, 103, -51]
spark_df = ps.DataFrame({"id": id, "values": values})

# Build an EntitySet and register the Spark DataFrame with Woodwork logical types
es = ft.EntitySet(id="spark_es")
es = es.add_dataframe(
    dataframe_name="spark_input_df",
    dataframe=spark_df,
    index="id",
    logical_types={"id": Integer, "values": Double},
)
es
but I got this error: "AttributeError: 'DataFrame' object has no attribute 'ww'"
Can anyone help me?
I just ran the official code posted at https://featuretools.alteryx.com/en/stable/guides/using_spark_entitysets.html
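For reference, here is a minimal snippet (not part of the original post) that prints the versions of the libraries involved, using the standard-library importlib.metadata, so environments can be compared:

import importlib.metadata as md

# Report the installed version of each package relevant to this setup.
for pkg in ("featuretools", "woodwork", "pyspark", "pandas"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "is not installed")

The missing 'ww' accessor is registered by woodwork, so if woodwork is absent or out of sync with featuretools, reinstalling with python -m pip install "featuretools[spark]" (the spark extra documented on the linked page) may resolve it.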
This code you provided works for me.
My library versions: