
Use Hex color in SwiftUI

In UIKit we could use an extension to set a hex color on almost everything: https://www.hackingwithswift.com/example-code/uicolor/how-to-convert-a-hex-color-to-a-uicolor

But when I try to do the same in SwiftUI it doesn't work; it looks like SwiftUI does not accept a UIColor as a parameter.

    Text(text)
        .color(UIColor.init(hex: "FFF"))

error message:

Cannot convert value of type 'UIColor' to expected argument type 'Color?'

I even tried to write an extension on Color instead of UIColor, but I haven't had any luck.

My extension for Color:

import SwiftUI

extension Color {
    init(hex: String) {
        let scanner = Scanner(string: hex)
        scanner.scanLocation = 0
        var rgbValue: UInt64 = 0
        scanner.scanHexInt64(&rgbValue)

        let r = (rgbValue & 0xff0000) >> 16
        let g = (rgbValue & 0xff00) >> 8
        let b = rgbValue & 0xff

        self.init(
            red: CGFloat(r) / 0xff,
            green: CGFloat(g) / 0xff,
            blue: CGFloat(b) / 0xff, alpha: 1
        )
    }
}

error message:

Incorrect argument labels in call (have 'red:green:blue:alpha:', expected '_:red:green:blue:opacity:')
The init is this one: developer.apple.com/documentation/swiftui/color/3265484-init. Your call is missing a parameter, as you can see in the error message: 'red:green:blue:alpha:' vs '_:red:green:blue:opacity:'. Note the _: at the start, which is for the _ colorSpace: parameter, and opacity vs alpha.
@Larme Yes, I tried that. It fixed the compile error, but there's no result: it does not set the color on the view. Did you solve it yourself? If so, please add the code.

P1xelfehler

You're almost there; you were just using the wrong initialiser parameters:

extension Color {
    init(hex: String) {
        let hex = hex.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int: UInt64 = 0
        Scanner(string: hex).scanHexInt64(&int)
        let a, r, g, b: UInt64
        switch hex.count {
        case 3: // RGB (12-bit)
            (a, r, g, b) = (255, (int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (a, r, g, b) = (255, int >> 16, int >> 8 & 0xFF, int & 0xFF)
        case 8: // ARGB (32-bit)
            (a, r, g, b) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            (a, r, g, b) = (1, 1, 1, 0)
        }

        self.init(
            .sRGB,
            red: Double(r) / 255,
            green: Double(g) / 255,
            blue:  Double(b) / 255,
            opacity: Double(a) / 255
        )
    }
}
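A minimal usage sketch, assuming the extension above is in scope (the color values are arbitrary):

Text("Hello, world!")
    .foregroundColor(Color(hex: "#FF5733"))    // 6-digit RGB; the leading # is stripped

Rectangle()
    .fill(Color(hex: "80FF0000"))              // 8-digit ARGB: roughly 50% opaque red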

It solved the compile error, thanks, but it did not set the color of the views in SwiftUI; no error but no result.
I tried with Color(hex: "ff00ff") and it worked fine. What are you passing as hex?
Please also indicate what color you get for a specific hex parameter.
Your solution doesn't work for '#hexColorStr'. Please use mine: stackoverflow.com/questions/36341358/…
This fails: XCTAssertEqual(Color(hex: "0xFFFFFF"), Color(red: 255, green: 255, blue: 255)), along with "ffffff" and "FFFFFF".
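Side note on that test (my own reading, not from the answer): Color(red:green:blue:) expects components in the 0...1 range, and "0xFFFFFF" is 8 characters long, so it falls into the ARGB branch with an opacity of 0. A comparison that mirrors what the extension actually builds should pass:

// Inside an XCTestCase; the extension constructs its colors in .sRGB.
XCTAssertEqual(Color(hex: "FFFFFF"),
               Color(.sRGB, red: 1, green: 1, blue: 1, opacity: 1))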
Sam Soffes

Another alternative below that uses an Int for the hex value; of course, it can be changed to String if you prefer that.

extension Color {
    init(hex: UInt, alpha: Double = 1) {
        self.init(
            .sRGB,
            red: Double((hex >> 16) & 0xff) / 255,
            green: Double((hex >> 08) & 0xff) / 255,
            blue: Double((hex >> 00) & 0xff) / 255,
            opacity: alpha
        )
    }
}

Usage examples:

Color(hex: 0x000000)
Color(hex: 0x000000, alpha: 0.2)

That's a good implementation! How would you use String instead of Int?
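One possible answer (a sketch of my own, not part of the original answer; the hexString: label is arbitrary): parse the string into an integer and forward to the Int-based initializer above.

extension Color {
    init(hexString: String, alpha: Double = 1) {
        // Strip "#" or other punctuation from the ends, then scan the hex digits.
        let cleaned = hexString.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var value: UInt64 = 0
        Scanner(string: cleaned).scanHexInt64(&value)
        self.init(hex: UInt(value), alpha: alpha)
    }
}

// Color(hexString: "#F5BC53", alpha: 0.8)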
For anyone interested, this approach is explained (in a general context, not related to SwiftUI) in The Swift Programming Language book, in the Advanced Operators chapter. The whole chapter is worth reading. TIP: The key to understanding is the right shift and the bitwise AND. The simplest examples are 1. halving a number using right shift (number >> 1) and 2. checking whether a number is odd (number & 0x1 == 1). The Bitwise operation Wikipedia article is worth reading as well.
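To make the shifts and masks from that comment concrete, here is a small standalone illustration (the values are arbitrary):

let hex: UInt = 0xF5BC53           // 1111_0101 1011_1100 0101_0011

let red   = (hex >> 16) & 0xFF     // shift the R byte into the low 8 bits -> 0xF5 (245)
let green = (hex >> 8)  & 0xFF     // shift the G byte down                -> 0xBC (188)
let blue  =  hex        & 0xFF     // mask off everything but the B byte   -> 0x53 (83)

let half  = 10 >> 1                // right shift by 1 halves a number     -> 5
let isOdd = (7 & 0x1) == 1         // the lowest bit tells you odd/even    -> true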
Why did you create a tuple only to extract all of its values in the very next statement? It doesn't make sense.
@PeterSchorn yeah makes sense, I removed the tuple. Thanks!
@TolgahanArıkan No problem. Glad I could help.
Stefan

Here is a Playground with my solution. It adds fallback after fallback and relies only on the hex string for color and alpha.

import SwiftUI

extension Color {
    init(hex string: String) {
        var string: String = string.trimmingCharacters(in: CharacterSet.whitespacesAndNewlines)
        if string.hasPrefix("#") {
            _ = string.removeFirst()
        }

        // Double the last value if incomplete hex
        if !string.count.isMultiple(of: 2), let last = string.last {
            string.append(last)
        }

        // Fix invalid values
        if string.count > 8 {
            string = String(string.prefix(8))
        }

        // Scanner creation
        let scanner = Scanner(string: string)

        var color: UInt64 = 0
        scanner.scanHexInt64(&color)

        if string.count == 2 {
            let mask = 0xFF

            let g = Int(color) & mask

            let gray = Double(g) / 255.0

            self.init(.sRGB, red: gray, green: gray, blue: gray, opacity: 1)

        } else if string.count == 4 {
            let mask = 0x00FF

            let g = Int(color >> 8) & mask
            let a = Int(color) & mask

            let gray = Double(g) / 255.0
            let alpha = Double(a) / 255.0

            self.init(.sRGB, red: gray, green: gray, blue: gray, opacity: alpha)

        } else if string.count == 6 {
            let mask = 0x0000FF
            let r = Int(color >> 16) & mask
            let g = Int(color >> 8) & mask
            let b = Int(color) & mask

            let red = Double(r) / 255.0
            let green = Double(g) / 255.0
            let blue = Double(b) / 255.0

            self.init(.sRGB, red: red, green: green, blue: blue, opacity: 1)

        } else if string.count == 8 {
            let mask = 0x000000FF
            let r = Int(color >> 24) & mask
            let g = Int(color >> 16) & mask
            let b = Int(color >> 8) & mask
            let a = Int(color) & mask

            let red = Double(r) / 255.0
            let green = Double(g) / 255.0
            let blue = Double(b) / 255.0
            let alpha = Double(a) / 255.0

            self.init(.sRGB, red: red, green: green, blue: blue, opacity: alpha)

        } else {
            self.init(.sRGB, red: 1, green: 1, blue: 1, opacity: 1)
        }
    }
}

let gray0 = Color(hex: "3f")
let gray1 = Color(hex: "#69")
let gray2 = Color(hex: "#6911")
let gray3 = Color(hex: "fff")
let red = Color(hex: "#FF000044s")
let green = Color(hex: "#00FF00")
let blue0 = Color(hex: "0000FF")
let blue1 = Color(hex: "0000F")

As for getting the hex string back out of a Color: that is not a public API. We still need to rely on UIColor implementations for that.

PS: I saw the components solution below, but if the API changes in the future, my version is a bit more stable.
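For completeness, a sketch of that round trip through UIKit (my own code, assuming iOS 14+ where UIColor can be built from a Color; the function name is arbitrary):

import SwiftUI
import UIKit

func hexString(from color: Color) -> String? {
    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
    // Bridge to UIKit and read back the RGBA components.
    guard UIColor(color).getRed(&r, green: &g, blue: &b, alpha: &a) else { return nil }
    return String(format: "#%02X%02X%02X%02X",
                  Int(r * 255), Int(g * 255), Int(b * 255), Int(a * 255))
}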


This is the best answer here. If you add opacity as a parameter, it will be the most complete one.
Agreed, I've tried numerous solutions and this is the best answer here. I don't know why it isn't upvoted more. Props @Stefan! As for opacity, just chain it like you normally would in SwiftUI: Color(hex: "#003366").opacity(0.2)
Nico S.

Try this:

extension Color {
    init(hex: Int, opacity: Double = 1.0) {
        let red = Double((hex & 0xff0000) >> 16) / 255.0
        let green = Double((hex & 0xff00) >> 8) / 255.0
        let blue = Double((hex & 0xff) >> 0) / 255.0
        self.init(.sRGB, red: red, green: green, blue: blue, opacity: opacity)
    }
}

Usage:

Text("Hello World!")
            .background(Color(hex: 0xf5bc53))

Text("Hello World!")
            .background(Color(hex: 0xf5bc53, opacity: 0.8))

I like this solution: simple, short and elegant.
But slow to compile :) 9 seconds from scratch :)
Fatemeh
The first extension builds a Color from a hex integer, with an optional alpha:

extension Color {
  init(_ hex: UInt, alpha: Double = 1) {
    self.init(
      .sRGB,
      red: Double((hex >> 16) & 0xFF) / 255,
      green: Double((hex >> 8) & 0xFF) / 255,
      blue: Double(hex & 0xFF) / 255,
      opacity: alpha
    )
  }
}

Then, you can use it like this:

let red = Color(0xFF0000)
let green = Color(0x00FF00)
let translucentMagenta = Color(0xFF00FF, alpha: 0.4)

The second extension allows for building a color from a hex string, covering most known formats. It allows for:

- Specifying a color with or without a leading #
- 2-digit format for shades of gray
- 3-digit shorthand for the 6-digit format
- 4-digit format for gray with alpha
- 6-digit format for RGB
- 8-digit format for RGBA
- Automatically returning nil for all invalid formats

extension Color {
  init?(_ hex: String) {
    var str = hex
    if str.hasPrefix("#") {
      str.removeFirst()
    }
    if str.count == 3 {
      str = String(repeating: str[str.startIndex], count: 2) 
        + String(repeating: str[str.index(str.startIndex, offsetBy: 1)], count: 2) 
        + String(repeating: str[str.index(str.startIndex, offsetBy: 2)], count: 2)
    } else if !str.count.isMultiple(of: 2) || str.count > 8 {
      return nil
    }
    let scanner = Scanner(string: str)
    var color: UInt64 = 0
    scanner.scanHexInt64(&color)
    if str.count == 2 {
      let gray = Double(Int(color) & 0xFF) / 255
      self.init(.sRGB, red: gray, green: gray, blue: gray, opacity: 1)
    } else if str.count == 4 {
      let gray = Double(Int(color >> 8) & 0x00FF) / 255
      let alpha = Double(Int(color) & 0x00FF) / 255
      self.init(.sRGB, red: gray, green: gray, blue: gray, opacity: alpha)
    } else if str.count == 6 {
      let red = Double(Int(color >> 16) & 0x0000FF) / 255
      let green = Double(Int(color >> 8) & 0x0000FF) / 255
      let blue = Double(Int(color) & 0x0000FF) / 255
      self.init(.sRGB, red: red, green: green, blue: blue, opacity: 1)
    } else if str.count == 8 {
      let red = Double(Int(color >> 24) & 0x000000FF) / 255
      let green = Double(Int(color >> 16) & 0x000000FF) / 255
      let blue = Double(Int(color >> 8) & 0x000000FF) / 255
      let alpha = Double(Int(color) & 0x000000FF) / 255
      self.init(.sRGB, red: red, green: green, blue: blue, opacity: alpha)
    } else {
      return nil
    }
  }
}

And here are some sample colors, demonstrating all the supported formats:

let gray1 = Color("4f")
let gray2 = Color("#68")
let gray3 = Color("7813")
let red = Color("f00")
let translucentGreen = Color("#00FF0066")
let blue = Color("0000FF")
let invalid = Color("0000F")
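Because the initializer is failable, the values above are Optionals; unwrap them or provide a fallback before handing them to a view (a usage sketch, assuming the failable initializer above is the one being resolved):

Text("Hello")
    .background(Color("#00FF0066") ?? .clear)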

Good Luck ;)


Patrick_K

I also used the UIColor solution from Hacking with Swift. Here is an adapted version for Color:

extension Color {
    init?(hex: String) {
        var hexSanitized = hex.trimmingCharacters(in: .whitespacesAndNewlines)
        hexSanitized = hexSanitized.replacingOccurrences(of: "#", with: "")

        var rgb: UInt64 = 0

        var red: Double = 0.0
        var green: Double = 0.0
        var blue: Double = 0.0
        var opacity: Double = 1.0

        let length = hexSanitized.count

        guard Scanner(string: hexSanitized).scanHexInt64(&rgb) else { return nil }

        if length == 6 {
            red = Double((rgb & 0xFF0000) >> 16) / 255.0
            green = Double((rgb & 0x00FF00) >> 8) / 255.0
            blue = Double(rgb & 0x0000FF) / 255.0

        } else if length == 8 {
            red = Double((rgb & 0xFF000000) >> 24) / 255.0
            green = Double((rgb & 0x00FF0000) >> 16) / 255.0
            blue = Double((rgb & 0x0000FF00) >> 8) / 255.0
            opacity = Double(rgb & 0x000000FF) / 255.0

        } else {
            return nil
        }

        self.init(.sRGB, red: red, green: green, blue: blue, opacity: opacity)
    }
}
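Since the initializer is failable, a usage sketch (the color value is arbitrary):

Text("Hello")
    .foregroundColor(Color(hex: "#1ABC9C") ?? .black)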

Norman

SwiftUI Color creation from hex (3, 4, 6, or 8 characters) with support for #, alpha, web constants, and UIColor constants. Usage examples below.

A Swift Package for iOS 14+ includes support for Color hex, random colors, CSS colors, and UserDefaults.

(Screenshot of usage examples: https://i.stack.imgur.com/qyNTN.png)


I don't see Color(hex: in the docs or in code completion.
@ScottyBlades Sorry about that. If you're using iOS 14, here's a package that provides Color support for hex and UserDefaults as well: github.com/nbasham/BlackLabsSwiftUIColor
MMK

Usage
UIColor.init(hex: "f2000000")
UIColor.init(hex: "#f2000000")
UIColor.init(hex: "000000")
UIColor.init(hex: "#000000")

extension UIColor {
    public convenience init(hex: String) {
        var cString: String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

        if cString.hasPrefix("#") {
            cString.remove(at: cString.startIndex)
        }

        var r: CGFloat = 0.0
        var g: CGFloat = 0.0
        var b: CGFloat = 0.0
        var a: CGFloat = 1.0

        var rgbValue: UInt64 = 0
        Scanner(string: cString).scanHexInt64(&rgbValue)

        if cString.count == 8 {
            // AARRGGBB (alpha first)
            r = CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0
            g = CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0
            b = CGFloat(rgbValue & 0x0000FF) / 255.0
            a = CGFloat((rgbValue & 0xFF000000) >> 24) / 255.0
        } else if cString.count == 6 {
            // RRGGBB
            r = CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0
            g = CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0
            b = CGFloat(rgbValue & 0x0000FF) / 255.0
            a = CGFloat(1.0)
        }

        self.init(red: r, green: g, blue: b, alpha: a)
    }
}

You are using a UIKit object: UIColor, not a SwiftUI Color.
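If you want to keep this UIColor extension and still use it in SwiftUI, you can bridge it at the call site (a sketch; Color has had a UIColor-based initializer since iOS 13):

Text("Hello")
    .foregroundColor(Color(UIColor(hex: "#f2000000")))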
Sawsan

You can use this extension for UIColor

extension UIColor {
    convenience init(hexaString: String, alpha: CGFloat = 1) {
        // Assumes a "#RRGGBB" string: drop the leading "#", then parse each byte.
        let chars = Array(hexaString.dropFirst())
        self.init(red:   .init(strtoul(String(chars[0...1]), nil, 16)) / 255,
                  green: .init(strtoul(String(chars[2...3]), nil, 16)) / 255,
                  blue:  .init(strtoul(String(chars[4...5]), nil, 16)) / 255,
                  alpha: alpha)
    }
}

Usage Example:

let lightGoldColor = UIColor(hexaString: "#D6CDB2")

Test Code:

(Screenshot of the test output: https://i.stack.imgur.com/bX6IZ.png)
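Since the original test only survives as a screenshot, here is a minimal sketch of a comparable check (my own code; the expected values follow from 0xD6, 0xCD, 0xB2):

var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
_ = UIColor(hexaString: "#D6CDB2").getRed(&r, green: &g, blue: &b, alpha: &a)
print(r * 255, g * 255, b * 255, a)    // ≈ 214.0 205.0 178.0 1.0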


