Category "semantic-segmentation"

How to connect broken lines into one continuous line?

I have this picture: as you can see, some lines are not connected. These lines were drawn using semantic segmentation. What I want is to connect these lines …
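One common post-processing direction is morphological closing on the predicted mask, which bridges small gaps between nearby segments. A minimal OpenCV sketch, assuming the segmentation output is a binary mask (the file name lines_mask.png and the kernel size are assumptions to tune):

```python
import cv2

# Load the predicted mask as a single-channel image (hypothetical file name).
mask = cv2.imread("lines_mask.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)

# Closing = dilation followed by erosion; nearby endpoints merge into one component.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))  # roughly the gap size to bridge
closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)

cv2.imwrite("lines_closed.png", closed)
```

Larger kernels bridge larger gaps but also merge lines that should stay separate, so the kernel size is a trade-off to tune on the actual masks.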

Implementing Multiclass Dice Loss Function

I am doing multi-class segmentation using UNet. My input to the model is H×W×C and my output is: outputs = layers.Conv2D(n_classes, (1, 1), activation='sigmoid') …
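For reference, a minimal sketch of a multiclass soft Dice loss in Keras, assuming y_true is one-hot encoded with shape (batch, H, W, n_classes); with mutually exclusive classes a softmax activation on that final Conv2D layer is the usual pairing rather than sigmoid:

```python
from tensorflow.keras import backend as K

def multiclass_dice_loss(y_true, y_pred, smooth=1e-6):
    # Sum over the spatial axes, keeping the class axis separate.
    axes = (1, 2)
    intersection = K.sum(y_true * y_pred, axis=axes)
    union = K.sum(y_true, axis=axes) + K.sum(y_pred, axis=axes)
    dice_per_class = (2.0 * intersection + smooth) / (union + smooth)
    # Average over classes and over the batch, then turn the score into a loss.
    return 1.0 - K.mean(dice_per_class)
```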

Negative gradients when calculating GradCAM heatmap

I have a segmentation network trained for 2 classes and am able to see accurate results. But when using Grad-CAM for the heatmap, I am able to see good …
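Negative gradients are expected in Grad-CAM: they are averaged into the channel weights, and the final ReLU keeps only the positive part of the weighted activation map. A minimal tf.GradientTape sketch, where the layer name "last_conv" and the class index are assumptions:

```python
import tensorflow as tf

def grad_cam(model, image, layer_name="last_conv", class_index=1):
    grad_model = tf.keras.Model(
        model.inputs, [model.get_layer(layer_name).output, model.output]
    )
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[tf.newaxis, ...])
        score = preds[..., class_index]            # the class channel of interest
    grads = tape.gradient(score, conv_out)         # may contain negative values
    weights = tf.reduce_mean(grads, axis=(1, 2))   # global-average-pool the gradients
    cam = tf.reduce_sum(conv_out * weights[:, tf.newaxis, tf.newaxis, :], axis=-1)
    cam = tf.nn.relu(cam)                          # drop negative contributions
    return cam / (tf.reduce_max(cam) + 1e-8)
```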

Why is there a difference in the Intersection over Union (IoU) calculation when evaluating the same data with the same model?

I evaluated the IoU score for the test dataset using the saved model (model.evaluate(test_gen, steps)). I have also calculated the IoU score for each image in …
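A frequent cause of the mismatch: model.evaluate with a stateful metric (e.g. tf.keras.metrics.MeanIoU) accumulates one confusion matrix over the whole dataset, whereas averaging per-image IoU scores weights every image equally. A small NumPy sketch with toy binary masks showing that the two aggregations generally differ:

```python
import numpy as np

def iou(y_true, y_pred):
    inter = np.logical_and(y_true, y_pred).sum()
    union = np.logical_or(y_true, y_pred).sum()
    return inter / union if union else 1.0

masks_true = [np.array([[1, 1], [0, 0]]), np.array([[1, 0], [0, 0]])]
masks_pred = [np.array([[1, 0], [0, 0]]), np.array([[1, 0], [0, 0]])]

# Average of per-image IoU scores.
per_image = np.mean([iou(t, p) for t, p in zip(masks_true, masks_pred)])

# Pool all pixels first, then compute a single dataset-level IoU.
pooled = iou(np.concatenate(masks_true), np.concatenate(masks_pred))

print(per_image, pooled)  # 0.75 vs 0.666... on this toy example
```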

Multiclass semantic segmentation model evaluation

I am doing a project on multiclass semantic segmentation. I have formulated a model that outputs pretty decent segmented images by decreasing the loss value. …
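Beyond watching the loss value, the usual evaluation metrics for multiclass segmentation are per-class IoU, mean IoU, and pixel accuracy, all derivable from one confusion matrix. A minimal NumPy sketch, assuming integer label maps y_true and y_pred and a known n_classes:

```python
import numpy as np

def evaluation_metrics(y_true, y_pred, n_classes):
    # Confusion matrix: rows = ground-truth class, columns = predicted class.
    cm = np.bincount(
        n_classes * y_true.reshape(-1) + y_pred.reshape(-1),
        minlength=n_classes ** 2,
    ).reshape(n_classes, n_classes)

    tp = np.diag(cm)
    per_class_iou = tp / (cm.sum(axis=0) + cm.sum(axis=1) - tp + 1e-8)
    pixel_accuracy = tp.sum() / cm.sum()
    return per_class_iou, per_class_iou.mean(), pixel_accuracy
```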

How is the smooth dice loss differentiable?

I am training a U-Net in Keras by minimizing the dice_loss function that is popularly used for this problem (adapted from here and here): def dsc(y_true, y_pred) …
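For context, the loss in question is typically written as in the sketch below. It is differentiable because y_pred enters only through sums and products of the network's continuous sigmoid outputs (no thresholding or argmax), and the smooth term keeps the denominator away from zero:

```python
from tensorflow.keras import backend as K

def dsc(y_true, y_pred, smooth=1.0):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    # Soft Dice coefficient: a smooth, differentiable surrogate for set overlap.
    return (2.0 * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)

def dice_loss(y_true, y_pred):
    return 1.0 - dsc(y_true, y_pred)
```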

AttributeError: module 'keras.utils' has no attribute 'get_file'

When I try to run the following code: from keras_segmentation.models.segnet import resnet50_segnet; from keras_segmentation.predict import model_from_c…
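This is usually a version mismatch between keras_segmentation and the installed Keras, where get_file is not exposed at keras.utils. A hedged workaround sketch: alias the TensorFlow implementation before importing the library (pinning compatible tensorflow/keras versions is the cleaner fix):

```python
import keras.utils
import tensorflow as tf

# Monkey-patch the symbol keras_segmentation expects, if this Keras build lacks it.
if not hasattr(keras.utils, "get_file"):
    keras.utils.get_file = tf.keras.utils.get_file

from keras_segmentation.models.segnet import resnet50_segnet
```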

Runtime Error - element 0 of tensors does not require grad and does not have a grad_fn

I am using a U-Net model for semantic segmentation. I have a custom dataset of images and their masks, both in .png format. I have looked in the online forums …
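That error usually means the loss tensor is detached from the autograd graph: the forward pass ran under torch.no_grad(), the output went through .detach() or .numpy() before the loss, or every model parameter has requires_grad=False. A minimal PyTorch training-step sketch (the model, images, masks, and optimizer names are placeholders for the questioner's setup):

```python
import torch.nn as nn

criterion = nn.CrossEntropyLoss()  # expects logits (N, C, H, W) and long masks (N, H, W)

def train_step(model, images, masks, optimizer):
    model.train()                    # not inside a torch.no_grad() block
    optimizer.zero_grad()
    logits = model(images)           # keep this as a tensor attached to the graph
    loss = criterion(logits, masks)  # loss now has a grad_fn
    loss.backward()                  # fails only if the graph was broken above
    optimizer.step()
    return loss.item()               # safe to detach *after* backward()
```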

Keras loss is NaN when training for semantic segmentation

I am using the headsegmentation dataset. A single mask looks like this. All mask images are a single channel. This is my code: image_size = 512; batch = 4; labels …
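A common cause of NaN losses in this setup is mask pixel values outside the range [0, n_classes - 1] (e.g. raw label ids or 0-255 values fed straight into a sparse categorical loss), or a loss/activation mismatch. A sketch of one hedged fix, remapping the raw mask values to contiguous class ids and pairing a softmax output with SparseCategoricalCrossentropy (label_values and the final-layer choice are assumptions about this dataset and model):

```python
import numpy as np
import tensorflow as tf

def remap_mask(mask, label_values):
    # label_values: sorted list of the raw pixel values actually used in the masks.
    remapped = np.zeros_like(mask, dtype=np.int32)
    for new_id, raw_value in enumerate(label_values):
        remapped[mask == raw_value] = new_id
    return remapped

# Final layer: Conv2D(n_classes, 1, activation="softmax"); labels are integer class ids.
loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False)
```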